WO2024208919A1 - Extended reality interface for robotic arm - Google Patents
- Publication number: WO2024208919A1 (PCT application PCT/EP2024/059089)
- Authority: WIPO (PCT)
- Legal status: Ceased (the legal status is an assumption and is not a legal conclusion)
Classifications
- B25J 9/1656 — Programme controls characterised by programming, planning systems for manipulators
- B25J 9/1671 — Programme controls characterised by simulation, either to verify an existing program or to create and verify a new program; CAD/CAM-oriented, graphic-oriented programming systems
- G05B 19/42 — Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations (e.g. manually controlled) and then played back on the same machine
- G05B 2219/36453 — Handheld tool like probe
- G05B 2219/39451 — Augmented reality for robot programming
Definitions
- Example aspects herein relate to robotics, and in particular to an extended reality, XR, device for a user controlling a robot, a system, a method for controlling a robot, and a computer program.
- Robots, and in particular robots having a robotic arm, are used in various industries to process products.
- Programming the movement of a robot requires advanced skills and knowledge, such as computer programming, process control, and the calculation of angles, positions, etc. These requirements limit the potential use of robots.
- an extended reality, XR, device for a user controlling a robot comprising a robotic arm and an end effector.
- the XR device comprises a visual device configured to display an extended reality, XR, content to said user.
- said XR content represents an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot.
- the XR device comprises an input module configured to receive a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
- the XR device comprises a communication module configured to transmit a signal for causing said at least a part of said robot to move substantially along said new trajectory.
- the visual device may be any suitable device (e.g. a headset with an integrated display) to be worn or otherwise used by the user to visualize XR content.
- the XR content may be any content displayed to a user that includes at least one 2D or 3D virtual element of digital information.
- the at least one virtual element may be superimposed onto a real environment, either captured or viewed live by the user, the at least one virtual element may be superimposed onto a virtual environment, or the at least one virtual element may represent a virtual environment.
- the XR content may be any of an augmented reality (AR) content, a mixed reality (MR) content, or a virtual reality (VR) content.
- the displayed environment is a real environment, which may be captured by imaging means.
- the imaging means may be located on the visual device (to correspond to a point of view of the user using the visual device) or may be located near the robot and be oriented towards the robot to capture images of the robot.
- the visual device may comprise transparent lenses allowing the user to view the real environment.
- the real environment may be shown to the user in real-time (or with negligible delay such as a few milliseconds between a capture of image and a display to the user).
- the AR or MR content may be superimposed on an image of the environment (captured by the imaging means) or projected on the transparent lenses so that the user views the AR or MR content and the real environment.
- a virtual environment is displayed.
- the virtual environment may be generated by a computer model of a real environment drawn or measured (e.g. using a LIDAR or other point cloud technology).
- the input module may be any suitable device held or otherwise used by the user, which allows the user to interact with the XR content. Specifically, a virtual pointer corresponding to the input module is displayed in the XR content. As the user moves the input module, the movement of the input module is captured (e.g. using sensors such as accelerometers in the input module and/or a system placed around the user to monitor movements of the input module), and the position of the virtual pointer moves according to the movement of the input module.
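The pointer-tracking behaviour described above can be sketched in a few lines. This is a minimal illustration only; the `VirtualPointer` class and its scale factor are hypothetical and not part of the disclosure:

```python
class VirtualPointer:
    """Tracks a virtual pointer whose position in the XR content follows
    the captured motion of a handheld input module."""

    def __init__(self, scale=1.0):
        self.position = [0.0, 0.0, 0.0]  # pointer position in XR space
        self.scale = scale               # motion-to-pointer scale factor

    def apply_motion(self, delta):
        """Displace the pointer by the detected movement of the input module."""
        self.position = [p + self.scale * d for p, d in zip(self.position, delta)]
        return self.position


pointer = VirtualPointer(scale=2.0)
pointer.apply_motion([0.1, 0.0, -0.05])  # a captured controller movement
# pointer.position is now approximately [0.2, 0.0, -0.1]
```

In a real system the displacement would come from the accelerometers in the input module or from an external monitoring system, sampled at a high rate so the pointer appears to move continuously.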
- manipulating should be understood to mean any interaction by the user with the virtual element representing the trajectory of at least a part of the robot, using any suitable input mechanism.
- the planned trajectory indicates a pre-programmed trajectory that the robot would follow when operating unless the user modifies the trajectory.
- the planned trajectory may be a trajectory of an ongoing movement of the robot or a movement that has not yet initiated.
- This trajectory may be for a specific part of the robot (e.g. a movement of a segment of the robotic arm, a rotation of a joint of the robotic arm, a movement, a rotation, or an operation of the end effector), or it may be a trajectory for multiple parts, or substantially the entire robotic arm (or entire robot).
- By "substantially along the new trajectory" it is intended to mean that the trajectory taken by the at least a part of the robot may coincide with the trajectory defined by the user input as closely as possible, or it may involve some modifications to the trajectory defined by the user input, as will be explained in more detail below.
- the communication module may transmit the signal via any suitable communication link, either directly to the robot or to another device that controls the movement of the robot.
- the XR device displaying the XR content provides an XR interface for the user controlling the robot.
- the XR device (and the XR interface provided by the XR device) improves user experience when controlling robots.
- said planned trajectory is a planned trajectory of at least one of said robotic arm and said end effector.
- said XR content further comprises a virtual representation of said at least a part of said robot, more preferably of said robot.
- said visual device is configured to display a movement of said virtual representation in accordance with at least one of said planned trajectory and said new trajectory.
- the user may visualize how the robot would move (according to the planned trajectory or the new trajectory) before the robot begins moving, thus improving the user's experience.
- said input module is configured to receive a user input indicating a selection between two or more of: said XR content displaying a virtual representation of said robot entirely, said XR content displaying a virtual representation of one or more parts of said robot selected by said user, and said XR content not displaying any virtual representation of said robot or part of said robot.
- said input module is configured to receive a user input indicating a selection between said XR content displaying an animation of a virtual representation of at least a part of said robot, and said XR content displaying a still virtual representation of at least a part of said robot.
- said input module is configured to receive a second user input for triggering said communication module to transmit said signal.
- the user may select a time when the at least a part of the robot should move in accordance with the defined new trajectory, thus allowing the user to modify the trajectory as desired.
- said input module is configured to receive said user input as the robot is moving along the planned trajectory, and said communication module transmits a signal that causes said at least a part of said robot to modify an ongoing movement.
- the user may control ongoing operations of the robot in real-time, thus allowing the user to correct potential issues with the planned trajectory.
- said XR device further comprises a control module configured to cooperate with said input module to define said new trajectory in accordance with a set of predetermined constraints defined for at least one of said environment and said robot.
- the trajectory defined by the user may be modified to meet the predetermined constraints, for example to avoid collision between the robot and elements of the environment, to avoid damage to the robot or to an object handled by the robot, etc. Accordingly, the user does not need to follow the predetermined constraints when defining a new trajectory (or check whether the new trajectory satisfies all predetermined constraints) as the control module makes this determination, which therefore improves user experience when defining the desired trajectory, whilst ensuring predetermined constraints are satisfied.
- said set of predetermined constraints comprises at least one of a physical constraint and an operational constraint.
- the set may comprise any number (e.g. more than one) of physical constraints and any number of operational constraints.
- the physical constraint(s) may depend on an operational limit of the robot, or the physical constraint(s) may depend on the physical environment of the robot.
- Physical constraint(s) depending on an operational limit of the robot may include, for example, a limit to possible configurations of the robot (e.g. a limit to rotations or extension of parts of the robot).
- Physical constraint(s) depending on the physical environment of the robot may include one or more spaces (two or three dimensional) around the robot that the robot cannot enter (for example zones where obstacles are present or are likely to be present).
- the operational constraint(s) depend on an operation that the robot is to perform, and may depend for example on an object to be processed by the end effector, such as a preferred orientation of the object when moved by the robotic arm (e.g. to prevent spillage or damage to the object), a restriction in movement in a particular direction when the object is held by the robotic arm, etc.
- constraint(s) may be predetermined to ensure the operation of the robot does not cause damage to the object being processed, to the robot, or to the environment of the robot.
- said control module is configured to cooperate with said input module to modify said new trajectory, to reduce at least one of: an amount of energy required for said at least a part of said robot to reach an end point of said new trajectory, a risk of collision of at least one of said robot and an object manipulated by said robot with said environment, and an effect that a movement of said at least a part of said robot substantially along said new trajectory would have on an object manipulated by said robot.
- the new trajectory that is to be followed by the robot can be optimized, in terms of energy resources and/or risk reduction.
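As a rough illustration of this kind of optimization, the sketch below shortens a user-defined polyline by iteratively pulling interior waypoints towards their neighbours, with path length standing in as a simple proxy for the energy needed. The function names and the smoothing rule are assumptions; a real planner would also account for kinematics, collision risk, and the manipulated object:

```python
def smooth_trajectory(points, iterations=10, alpha=0.5):
    """Shorten a user-defined trajectory by repeatedly pulling each
    interior waypoint towards the midpoint of its neighbours, keeping
    the start and end points fixed."""
    pts = [list(p) for p in points]
    for _ in range(iterations):
        for i in range(1, len(pts) - 1):
            for k in range(len(pts[i])):
                midpoint = (pts[i - 1][k] + pts[i + 1][k]) / 2.0
                pts[i][k] += alpha * (midpoint - pts[i][k])
    return pts


def path_length(points):
    """Total Euclidean length of a polyline."""
    return sum(
        sum((b - a) ** 2 for a, b in zip(p, q)) ** 0.5
        for p, q in zip(points, points[1:])
    )


raw = [[0, 0, 0], [1, 2, 0], [2, -1, 0], [3, 0, 0]]  # jagged user-drawn path
smoothed = smooth_trajectory(raw)                    # endpoints unchanged, shorter overall
```

The same structure extends naturally to the other criteria listed above: collision risk or the effect on a carried object would simply add further penalty terms to what is minimised.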
- said communication module is configured to communicate with a robot module on which said robot is placed.
- said robot module comprises a plurality of sensors, preferably configured to sense a space in a vicinity of said robot and said robot module for the presence of any object.
- the XR device and the module may exchange information related to the current environment of the robot placed on the robot module.
- the XR device may provide information to the robot module on the new trajectory of the robot, which may use this information to adapt the sensing of the space in the vicinity of the robot and the module.
- said communication module is configured to obtain data related to at least one of said robot and said space from said robot module.
- the data obtained from the robot module may indicate a fault occurring in the robot or in the robot module, a sensed object to be represented in the XR content, or a potential collision with the sensed object.
- the data obtained from the robot module may be used to present suitable information to the user as part of the XR content, and/or to modify the trajectory in accordance with the current environment of the robot.
- said visual device is configured to modify said virtual element in accordance with said received user input substantially contemporaneously with said user manipulating said virtual element.
- By "substantially contemporaneously" it is intended to mean that the modification is displayed simultaneously or with a negligible delay (e.g. a few milliseconds).
- the manipulation performed by the user may be instantaneously shown to the user, thus allowing the user to better define the new trajectory.
- said virtual element comprises at least one of a curve and a number of points in a virtual three dimensional space.
- the curve may be a line or a plurality of segments that may be linked.
- the virtual element may also comprise a plurality of curves separate from each other (e.g. showing only parts of the trajectory) or a number (one or more) of points that should be reached by the part of the robot in a given order.
- the user may easily visualize the trajectory and identify all or a set of points that the part of the robot should travel to.
- said user input modifies said virtual element by dragging or otherwise moving a part of said virtual element in said environment represented by said XR content.
- the part may be a point or a section of a curve representing the trajectory.
- the user may therefore selectively modify a part of the trajectory, whilst the rest of the trajectory is either substantially unchanged or is adapted to be in conformity with the modified part.
- the user may easily modify only a part of the trajectory.
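One possible editing rule matching this behaviour is sketched below: the dragged waypoint moves fully to the target position, and nearby waypoints follow with linearly decreasing weight so the rest of the curve stays in conformity with the modified part. The `falloff` blending scheme is hypothetical, chosen purely for illustration:

```python
def drag_waypoint(trajectory, index, new_point, falloff=2):
    """Move one waypoint to a new position and blend the displacement
    into nearby waypoints with linearly decreasing weight, leaving the
    rest of the trajectory substantially unchanged."""
    traj = [list(p) for p in trajectory]
    delta = [n - o for n, o in zip(new_point, traj[index])]
    for offset in range(-falloff, falloff + 1):
        i = index + offset
        if 0 <= i < len(traj):
            weight = 1.0 - abs(offset) / (falloff + 1)
            traj[i] = [c + weight * d for c, d in zip(traj[i], delta)]
    return traj


path = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [4.0, 0.0]]
edited = drag_waypoint(path, 2, [2.0, 3.0])
# edited[2] reaches [2.0, 3.0]; its neighbours move partway towards it
```

The original trajectory is left untouched, so an editor could keep it for an undo operation or for displaying the planned and new trajectories side by side.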
- said XR device comprises means to alert said user of an issue related to at least one of said new trajectory and said robot.
- the issue that is alerted may indicate that the new trajectory should be rectified, for example if it fails to comply with a predetermined constraint, or the issue may indicate a technical issue with the robot that should be attended to by the user, for example to repair, replace or maintain the robot (or the robot module).
- said means to alert comprise at least one of audio, visual and haptic means.
- the visual device may be coupled to loudspeakers, earphones etc. to alert the user.
- the visual alert may be provided as part of the XR content, or, in case the XR content is AR or MR content, it may be a visual alert placed in a location that is likely to be in the field of view of the user (for example on the input module), so that the user viewing the XR content is likely to view the visual alert.
- the haptic means may be provided via the input means or any other part of the XR device in contact with the user (e.g. the headset worn by the user), and may generate a vibration to alert the user.
- a system comprising the XR device according to the first example aspect disclosed herein, and said robot.
- the system comprises said robot module.
- a method for a user controlling a robot comprising a robotic arm and an end effector.
- the method comprises displaying an extended reality, XR, content by means of a visual device, said XR content preferably representing an environment in which at least one of said user and said robot is located, and preferably comprising a virtual element representing a planned trajectory for at least a part of said robot.
- the method comprises receiving a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
- the method comprises transmitting a signal for causing said at least a part of said robot to move substantially along said new trajectory.
- said planned trajectory is a planned trajectory of at least one of said robotic arm and said end effector.
- said XR content further comprises a virtual representation of said at least a part of said robot.
- the method further includes displaying a movement of said virtual representation in accordance with at least one of said planned trajectory and said new trajectory.
- the method further comprises modifying said new trajectory in accordance with a set of predetermined constraints defined for at least one of said environment and said robot.
- said set of predetermined constraints comprises at least one of a physical constraint and an operational constraint.
- said modifying said new trajectory includes reducing at least one of: an amount of energy required for said at least a part of said robot to reach an end point of said new trajectory, a risk of collision of at least one of said robot and an object manipulated by said robot with said environment, and an effect that a movement of said at least a part of said robot along said new trajectory would have on an object manipulated by said robot.
- a computer program comprising instructions which, when executed by one or more processors, cause said one or more processors to carry out the method according to the third example aspect disclosed herein.
- a method for a user controlling a robot comprising a robotic arm and an end effector, the method comprising: causing a visual device to display an extended reality, XR, content to said user, said XR content representing an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot; receiving a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot; and transmitting a signal for causing said at least a part of said robot to move substantially along said new trajectory.
- a computer program comprising instructions which, when executed by one or more processors, cause said one or more processors to carry out the method according to the fifth example aspect disclosed herein.
- Figure 1 shows a schematic view illustrating an example of a system in example embodiments
- Figure 2 shows a schematic diagram illustrating an example of an XR device in example embodiments
- Figure 3 shows a schematic view illustrating an example of an XR device and an XR content displayed to a user, in example embodiments
- Figure 4 shows a schematic diagram illustrating an example of an XR device in example embodiments
- Figures 5 to 8 show schematic views illustrating examples of XR content displayed to a user, in example embodiments
- Figures 9a and 9b show schematic views illustrating an example of a robot module coupled to a robot, in example embodiments
- Figure 10 shows a schematic diagram illustrating an example of a general kind of programmable processing apparatus that may be used to implement a control module in example embodiments;
- Figure 11 shows processing operations performed in example embodiments.
- Each communication link may be, for example, a wireless communication link (for example a Wi-Fi, cellular telephone data link such as LTE/5G, Bluetooth or Bluetooth Low Energy (BLE)) or a wired communication link (e.g. DSL, fibre-optic cable, Ethernet etc.)
- Each communication link need not be permanent.
- the following description will simply refer to elements transmitting or receiving signals, information, or data, which should be understood to be via a communication link established between these elements (either directly or via one or more intermediary elements acting as a relay).
- the system comprises an XR device 10, a robot 20 and a robot module 30.
- the XR device 10 comprises a visual device 120 (shown on the example of Figure 1 in the form of a wearable headset), and an input module 110 (shown on the example of Figure 1 in the form of a remote controller with a button).
- the user 60 holds the input module 110, and views an XR content through the visual device 120.
- a communication module 130 may be co-located with the visual device 120 (e.g. in the wearable headset as shown on Figure 1).
- the robot 20 comprises at least a robotic arm 22 having a number of segments and an end effector 24.
- the end effector 24 may be used for a range of processes, where the process may be at least one of picking and placing objects (e.g. pick-and-place operations), assembling products, additive manufacturing (e.g. 3D printing), and visual detection (e.g. detecting a set of points in an environment using imaging/sensing means and using the detected set of points to identify/optimise a trajectory of the end effector in the environment, or to determine how the end effector should contact an object corresponding to the set of points).
- the robot module 30 is communicatively coupled to the robot 20 and comprises a plurality of sensors.
- the robot module 30 provides one or more functions such as detecting objects in a space in a vicinity of the robot 20 and controlling the robot 20 to avoid or minimise an effect of a collision with a detected object, or facilitating communication between the robot 20 and the XR device 10 by converting data to/from the robot into a data format that is compatible with the XR device 10.
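The format-conversion role of the robot module might look like the following sketch, which maps a raw robot status frame into a structured record an XR device could consume. The tuple layout of the input and the keys of the output are assumptions made purely for illustration; the actual formats are implementation-specific:

```python
def to_xr_format(robot_frame):
    """Convert a raw robot status frame (joint angles, end-effector pose,
    fault code) into a structured record suitable for an XR device."""
    joint_angles, effector_pose, fault_code = robot_frame
    return {
        "joints_deg": list(joint_angles),
        "end_effector": {
            "position": list(effector_pose[:3]),
            "orientation": list(effector_pose[3:]),
        },
        "fault": fault_code != 0,  # non-zero code signals a fault to report
    }


frame = ((10.0, 45.0, -30.0), (0.5, 0.2, 0.8, 0.0, 0.0, 90.0), 0)
record = to_xr_format(frame)  # record["fault"] is False for this frame
```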
- the top part of Figure 1 shows an example of an XR content 40 that may be displayed to the user.
- the XR content is a mixed reality (MR) content, which is captured using stereoscopic imaging means provided on the visual device 120 and oriented so as to coincide with the view of the user 60.
- the orientation of the visual device 120 is monitored using a monitoring system (not shown on Figure 1) detecting changes in movement of the visual device 120 due to movements of the head of the user 60.
- the MR content includes one or more real elements being displayed, such as the robot 20 and the robot module 30, and a number of virtual elements being displayed, such as a planned trajectory 41 of the end effector 24, a number of points 42 along the planned trajectory, and a pointer 43 corresponding to the input module 110, the user 60 being able to control the position of the pointer 43 within the MR content by moving the input module 110.
- Although Figure 1 shows a virtual element for a planned trajectory of a single part of the robot, this is purely for simplicity.
- the XR content 40 may comprise virtual elements (e.g. curves or points) representing a planned trajectory for a plurality of parts of the robot, for example a trajectory for each rotatable joint of the robotic arm 22.
- Each of these virtual elements may be manipulable by the user.
- the XR device 10 comprises an input module 110, a visual device 120, and a communication module 130. As explained above, each of these elements of the XR device 10 may be communicatively coupled to each other using any suitable communication link.
- the input module 110 is controlled by the user as an input mechanism to provide instructions to the XR device 10.
- the input module 110 comprises, or is coupled with, means for detecting a movement of the input module 110 that is imparted by the user holding the input module 110. For example, when the user moves the input module 110, this movement is detected (e.g. using sensors in the input module 110 or a separate monitoring system monitoring the position and orientation of the input module 110).
- the input module 110 comprises a number of input means such as buttons, sliders, triggers etc.
- Each input means has a predetermined (or user configurable) associated operation having an effect on the XR content.
- each input means (e.g. a button, a slider) may be associated with a different operation; for example, one input means may indicate a desired switch between two different display modes (e.g. with different levels of information being displayed), etc.
- Upon detecting a user operation (e.g. a movement and/or an operation of one or more of the input means), a first signal indicating this user operation is transmitted to the visual device 120.
- the visual device 120 displays an XR content to the user, using a display or by projecting virtual elements of XR content on optical elements that are located in the user's view.
- the visual device 120 displays a virtual pointer corresponding to the input module 110, and a virtual element representing a planned trajectory of at least a part of the robot 20 (e.g. of a part of the robotic arm, or of the end effector).
- the visual device 120 determines whether the user operation indicated by the first signal requires a modification to the displayed XR content.
- the visual device 120 modifies the XR content in accordance with the user operation. For example, if the user operation indicates a movement of the input module 110, the visual device 120 causes a virtual pointer corresponding to the input module 110 to move in accordance with the movement of the input module 110. Additionally, if the first signal indicates that the user operated one of the input means, the visual device 120 can modify the displayed XR content in accordance with any user operation corresponding to the operated input means, for example by adjusting a zoom level, toggling between different modes of displaying information, etc.
- the visual device 120 also determines whether the first signal indicates that a user manipulation (e.g. a click-and-drag operation) was performed on the virtual element representing the planned trajectory of the robot.
- the visual device 120 modifies the virtual element in accordance with the user manipulation, and transmits a second signal to the communication module 130 indicating the user manipulation and corresponding change to the virtual element.
- the communication module 130 comprises means for communicating with at least the robot 20.
- the communication module 130 may also communicate with an external device.
- Upon receiving the second signal, the communication module 130 generates a third signal for causing the robot 20 to perform the desired change to its operation in accordance with the user manipulation.
- This third signal may be transmitted to the robot 20 directly, or it may be transmitted to an external device, separate from the robot 20 and the XR device 10, which in turn controls the robot 20.
- signals between the input module 110, the visual device 120 and the communication module 130, or signals transmitted from the communication module 130 to cause an operation of the robot 20 may be transmitted recurringly.
- the first signal may be generated recurringly, for example every millisecond, at least when a movement of the input module 110 or a user operation on an input means is detected, such that the visual device 120 receives the first signal and modifies the XR content as the user moves the input module 110 or operates the input means. Accordingly, the user may receive visual feedback of the current operation on the input module 110.
- a delay between the user operation and the corresponding modification to the XR content is below or near a human perception threshold, such that the user feels the XR content changes simultaneously with the operation on the input module 110.
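The latency budget implied here can be expressed as a simple check: the sum of capture, transmission, and rendering delays should stay below a human perception threshold. The threshold value and the breakdown into stages below are assumptions for illustration:

```python
PERCEPTION_THRESHOLD_MS = 20  # assumed threshold below which visual
                              # feedback feels simultaneous to the user


def feels_simultaneous(capture_ms, render_ms, transmit_ms=1):
    """Return True if the end-to-end delay from the user's operation on
    the input module to the updated XR content stays below the assumed
    perception threshold."""
    return capture_ms + transmit_ms + render_ms < PERCEPTION_THRESHOLD_MS


ok = feels_simultaneous(capture_ms=2, render_ms=11)     # 14 ms total -> True
late = feels_simultaneous(capture_ms=10, render_ms=16)  # 27 ms total -> False
```

Generating the first signal every millisecond, as suggested above, keeps the capture stage well inside such a budget, leaving most of the headroom for rendering.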
- the visual device 120 displays an XR content 40 to the user.
- the XR content 40 comprises real elements including the robot 20, an object 50 to be processed by the robot 20 and a working surface 51 on which the object 50 is to be moved.
- the XR content 40 also includes a number of virtual elements, including a planned trajectory of the robot 20 (in this case, the planned trajectory of the end effector) when the robot 20 picks up the object 50 from an initial position and moves it to a final position (as indicated by element 50', which may be a virtual representation of the object 50).
- any number of real elements or virtual elements may be displayed instead, and real elements may be omitted (for example in the case of virtual content being displayed).
- the XR content 40 also includes a virtual pointer 43 which moves as the user moves the input module 110.
- the robot 20 receives a signal instructing it to pick up the object 50 and move it along trajectory 44 to the final position 50'.
- the XR device 11 comprises a control module 140.
- the control module 140 transmits a signal to the visual device 120 indicating the XR content to be displayed to the user.
- the input module 110 transmits the first signal to the control module 140, instead of the visual device 120.
- the control module 140 determines whether the first signal indicates that a user manipulation (e.g. a click-and-drag operation) was performed on the virtual element representing the planned trajectory of the robot.
- the control module 140 determines a modification required to the displayed XR content and transmits a signal to the visual device 120 to cause the visual device 120 to display the modified XR content.
- the control module 140 determines whether the new trajectory defined by the user manipulation satisfies a number of predetermined operational and physical constraints that are defined for the robot 20 and for the environment.
- the control module 140 determines a modification required to the displayed XR content and transmits a signal to the visual device 120 to cause the visual device to display the modified XR content. Then, upon receiving another signal from the input module 110 indicating the user confirmed the displayed new trajectory (e.g. a second user input), the control module 140 transmits another signal to the communication module 130 to cause the communication module 130 to transmit the signal for causing the robot 20 to move along the new trajectory.
- control module 140 either modifies the new trajectory to comply with the predetermined constraints or alerts the user that the robot 20 cannot proceed along the new trajectory, thus allowing the user to re-define a new trajectory.
- control module 140 and the input module 110 cooperate to define a new trajectory in accordance with the predetermined constraints.
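The validate-then-modify-or-alert behaviour described above can be sketched in code. The following is a minimal illustration only, not the application's implementation: the `Waypoint` type, the `MAX_REACH` value, and the clamping strategy are all assumptions standing in for whatever physical constraints are predetermined for the robot 20.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float
    y: float
    z: float

MAX_REACH = 1.2  # assumed maximum reach of the robotic arm, in metres

def within_reach(p: Waypoint) -> bool:
    """Physical constraint: the waypoint must lie within the arm's reach."""
    return (p.x ** 2 + p.y ** 2 + p.z ** 2) ** 0.5 <= MAX_REACH

def check_trajectory(points: list[Waypoint]) -> tuple[bool, list[Waypoint]]:
    """Return (ok, trajectory). A waypoint violating the reach constraint is
    clamped back inside the workspace (the 'modify' branch); the caller may
    instead alert the user and let them redefine the trajectory."""
    ok = True
    fixed = []
    for p in points:
        if within_reach(p):
            fixed.append(p)
        else:
            ok = False
            norm = (p.x ** 2 + p.y ** 2 + p.z ** 2) ** 0.5
            s = MAX_REACH / norm
            fixed.append(Waypoint(p.x * s, p.y * s, p.z * s))
    return ok, fixed
```

When `ok` is false, a system following the scheme above could either display the clamped trajectory for user confirmation or raise an alert and discard the manipulation.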
- control module 140 may cause the visual device 120 to display an XR content comprising virtual elements (e.g. curves or points) representing a planned trajectory for a plurality of parts of the robot, for example a trajectory for each rotatable joint of the robotic arm 22.
- Each of these virtual elements may be manipulable by the user.
- each joint may be rotatable by a different amount, or each extendable segment of the robotic arm may have a different load capacity.
- control module 140 may determine whether the indicated user manipulation satisfies the set of constraints predetermined for the corresponding part of the robot.
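A per-part constraint check of the kind just described could look like the following sketch, where each rotatable joint has its own rotation limits. The joint names, limit values, and function name are illustrative assumptions, not taken from the application.

```python
JOINT_LIMITS = {          # assumed rotation limits in degrees, per rotatable joint
    "base":     (-180.0, 180.0),
    "shoulder": (-90.0, 90.0),
    "elbow":    (-120.0, 120.0),
    "wrist":    (-360.0, 360.0),
}

def manipulation_allowed(joint: str, target_angle: float) -> bool:
    """True if the user-requested angle satisfies the constraints
    predetermined for this specific part of the robot."""
    lo, hi = JOINT_LIMITS[joint]
    return lo <= target_angle <= hi
```

Other per-part constraints (e.g. a different load capacity per extendable segment) would follow the same pattern with a different lookup table.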
- the XR content 40 shown on Figure 5 includes the robot 20, and a virtual representation of the robot 20' (e.g. a virtual twin of the robot).
- the XR content 40 comprises a virtual element 41 representing a planned trajectory of the robot.
- a virtual element representing the new trajectory defined by the user manipulation may be displayed on the XR content 40, instead of or in addition to the virtual element 41 representing the planned trajectory.
- the XR content 40 shows an animation of the virtual representation 20' of the robot 20 as it moves along the trajectory 41, thus allowing the user to visualize how the robot 20 would move when instructed.
- the XR content 40 may show a virtual representation of only a part of the robot 20 (e.g. the part corresponding to the trajectory 41), which may be animated as the represented part moves along the trajectory 41.
- the user may select between the XR content displaying a virtual representation of the entire robot 20, a virtual representation of one or more parts of the robot 20 selected by the user (e.g. only the end effector 24, or the end effector 24 and a rotatable joint connected to the end effector 24), or not viewing any virtual representation of the robot 20.
- the user may also select between any virtual representation of a (part of) the robot 20 being animated or not. User selections may be performed, for example, via input means provided on the input module 110.
- the XR content 40 shown on Figure 6 includes the robot 20, and a virtual element 41 representing the planned trajectory of the robot.
- the XR content includes a new trajectory 44 ending at a new final position 46, that the user defined by drawing the new desired trajectory 44 (i.e. not by dragging a part of the virtual element 41).
- the control module 140 determines that the new trajectory does not comply with a predetermined physical constraint.
- control module 140 causes the visual device 120 to display a visual alert 47 which notifies the user of the issue.
- the XR content 40 shown on Figure 7 includes the robot 20, and a virtual element 44 representing the new trajectory of the robot 20 defined by user manipulation, ending at a final position 46.
- control module 140 determines that the new trajectory represented by the virtual element 44 does not comply with an operational constraint, namely that an amount of energy required for the robot 20 to reach the final position 46 is not minimized (in other words, that the trajectory can be optimized in terms of energy requirements).
- control module 140 modifies the new trajectory to minimize the required energy, and causes the visual device 120 to include in the XR content 40 a virtual element 48 representing the new trajectory modified to minimize the required energy.
- the control module 140 may then (either when the trajectory represented by the virtual element 48 is approved by the user or automatically) transmit a signal to the communication module 130, which in turn transmits a signal for causing the robot 20 to move along the trajectory represented by the virtual element 48. Because the modified new trajectory (represented by the virtual element 48) has the same end position 46 as the new trajectory defined by the user (represented by the virtual element 44), a movement of the robot 20 along one of the trajectories is considered to be substantially along the other of the trajectories.
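One simple stand-in for the energy optimization described above is to treat the energy cost of a trajectory as proportional to its total path length, so the minimal-energy trajectory to the user-chosen final position 46 is the straight segment from start to end. This heuristic is an assumption for illustration; a real controller would use the robot's dynamics model.

```python
def path_length(points):
    """Total Euclidean length of a piecewise-linear trajectory."""
    return sum(
        sum((b - a) ** 2 for a, b in zip(p, q)) ** 0.5
        for p, q in zip(points, points[1:])
    )

def minimize_energy(points):
    """Replace the user's path with the straight segment from start to end.
    The end position is preserved, so the robot's movement still counts as
    substantially along the user-defined trajectory."""
    return [points[0], points[-1]]
```

The key property is that the optimized path never exceeds the user path in length and always shares its final position.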
- the XR content 40 shown on Figure 8 includes the robot 20, an object 50 to be picked-up by the robot 20, and an object 52 present in the environment of the robot.
- the XR content 40 comprises a virtual element 44 representing the new trajectory of the robot 20 defined by user manipulation.
- the control module 140 determines that the new trajectory represented by the virtual element 44 does not comply with a predetermined constraint, namely that a collision with the environment (and specifically objects in the environment) should be avoided.
- control module 140 modifies the new trajectory to avoid the collision, and causes the visual device 120 to include in the XR content 40 a virtual element 48 representing the new trajectory modified to avoid a collision.
- the robot 20 may then move the object 50 according to the new modified trajectory represented by the virtual element 48.
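The collision-avoidance modification above can be sketched by modelling each detected object as a sphere with a safety radius and testing every waypoint of the new trajectory against it; a violating waypoint is lifted clear of the obstacle while the start and end positions are kept. The sphere model and the vertical-lift strategy are illustrative assumptions.

```python
def collides(point, obstacle, radius):
    """True if the point is inside the obstacle's safety sphere."""
    return sum((a - b) ** 2 for a, b in zip(point, obstacle)) ** 0.5 < radius

def avoid_collision(points, obstacle, radius):
    """Return a modified trajectory whose waypoints all clear the obstacle,
    keeping the original start and end positions."""
    out = []
    for p in points:
        if collides(p, obstacle, radius):
            # lift the waypoint along z until it clears the safety sphere
            out.append((p[0], p[1], obstacle[2] + radius))
        else:
            out.append(p)
    return out
```

A modified trajectory produced this way would correspond to the virtual element 48 shown to the user before the robot 20 moves the object 50.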
- the robot 20 includes a robotic arm 22, an end effector 24 and a base 26.
- the robot module 30 comprises a plurality of sensors 32 and an interface unit 34. With the sensors 32, the robot module 30 may detect objects in a space in a vicinity of the robot 20 and control the robot 20 to avoid or minimise the effect of a collision with a detected object.
- the robot module 30 may facilitate communication between the robot 20 and the XR device 10 by converting data to/from the robot into a data format that is compatible with the XR device 10.
- the interface unit 34 is located on a top surface of the robot module 30 and is mechanically coupled to the robot 20 via the base 26 of the robot, using for example fastening means such as bolts and threaded through-holes.
- the top surface of the robot module 30 has a substantially octagonal shape, and a bottom surface of the robot module 30 has a substantially square shape.
- the robot module 30 has eight side surfaces (four triangular and four trapezoidal) joining the top surface and the bottom surface. Between the top surface and the bottom surface (defining planes P1 and P2, respectively), the robot module 30 defines a taper which is narrower at the top surface than the bottom surface.
- Each of these sides includes a respective sensor 32, although Figure 9a only shows four of these sensors.
- each sensor 32 is facing a respective direction away from the robot module 30 (and the robot 20), and can therefore detect a space around the robot module 30 and the robot 20 for any object that should not collide with the robot 20.
- the robot module 30 has a tapered outer shape, which is narrower towards the top.
- a cross section along a horizontal plane of the robot module 30 has a smaller area towards the top of the robot module 30 than towards the bottom of the robot module 30. Accordingly, more components of the robot module 30 may be placed towards the bottom (i.e. towards plane P2), thus lowering the center of gravity of the robot module 30, which in turn improves its stability.
- Each sensor 32 may comprise one or more cameras (e.g. IR, RGB/visible or multispectral cameras), infrared or ultrasonic sensors, LIDARs or other suitable sensors.
- the sensors 32 comprise a plurality of RGB cameras.
- each sensor 32 comprises at least an RGB camera (i.e. a camera capturing visible light), and processing means for identifying object(s) on a captured image and for estimating a distance from the sensor 32 to each identified object. Based on this distance, the sensor 32 may determine whether each object is within a predetermined space in a vicinity of the robot and the module.
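The per-sensor decision described above — estimate a distance to each identified object, then test whether the object lies within the predetermined space — reduces to a simple threshold check. The following sketch assumes distances are already estimated; the 1.5 m radius and all names are illustrative, not from the application.

```python
SAFE_DISTANCE_M = 1.5  # assumed radius of the monitored space around robot and module

def objects_in_vicinity(estimated_distances: dict[str, float]) -> list[str]:
    """Return the labels of detected objects lying within the
    predetermined space in the vicinity of the robot and the module."""
    return [label for label, distance in estimated_distances.items()
            if distance <= SAFE_DISTANCE_M]
```

A non-empty result could then trigger the collision-avoidance behaviour of the robot module 30.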
- Figure 9b shows a top view of the robot 20 and the robot module 30. For illustrative purpose, only a part of the robot 20 is shown on Figure 9b.
- By orienting each sensor 32 in a different direction, a space 36 in the vicinity of the robot 20 and the robot module 30 may be sensed for objects.
- Figure 10 shows an example of a general kind of programmable processing apparatus 70 that may be used in various components of the XR device 10 or 11 described herein, such as the input module 110, the visual device 120, the communication module 130, or the control module 140.
- the programmable processing apparatus 70 comprises one or more processors 71, one or more input/output communication modules 72, one or more working memories 73, and one or more instruction stores 74 storing computer-readable instructions which can be executed by one or more processors 71 to perform the processing operations as described hereinafter.
- An instruction store 74 is a non-transitory storage medium, which may comprise a non-volatile memory, for example in the form of a read-only-memory (ROM), a flash memory, a magnetic computer storage device (for example a hard disk) or an optical disk, which is pre-loaded with the computer-readable instructions.
- an instruction store 74 may comprise writeable memory, such as random access memory (RAM) and the computer-readable instructions can be input thereto from a computer program product, such as a non-transitory computer-readable storage medium 75 (for example an optical disk such as a CD-ROM, DVD-ROM, etc.) or a computer-readable signal 76 carrying the computer-readable instructions.
- the combination 77 of hardware components shown in Figure 3 and the computer-readable instructions are configured to implement the functionality of the control module 140.
- operations caused when one or more processors 71 executes instructions stored in an instruction store 74 are described generally as operations performed by one of the input module 110, the visual device 120, the communication module 130, the control module 140, or by elements of the robot 20 or the robot module 30.
- the information processing apparatus displays an XR content by means of an extended reality, XR, visual device, said XR content representing an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot.
- the information processing apparatus causes the XR visual device to display said XR content.
- the information processing apparatus receives a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
- the information processing apparatus transmits a signal for causing said at least a part of said robot to move substantially along said new trajectory.
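The four steps listed above — display the XR content, receive a manipulation defining a new trajectory, then transmit the motion signal — can be expressed as a single control sequence. In this sketch the callables stand in for the XR visual device, the input module, and the communication module; all names and the dictionary payloads are illustrative assumptions.

```python
def run_interface(display, read_manipulation, transmit, planned_trajectory):
    """Display the planned trajectory, let the user manipulate it into a new
    trajectory, and emit the signal that moves the robot along it."""
    display({"environment": "workspace", "trajectory": planned_trajectory})
    new_trajectory = read_manipulation(planned_trajectory)
    transmit({"command": "move", "trajectory": new_trajectory})
    return new_trajectory
```

In a real system the `transmit` step could additionally wait for a confirming second user input, as described elsewhere in this application.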
- the robot 20 comprises one robotic arm with an end effector.
- the robotic arm may comprise more than one end effector.
- any suitable robot having at least one robotic arm may be used, such as robots comprising more than one robotic arm (each with one or more end effectors) and may comprise other elements.
- the robot module 30 comprises a plurality of sensors 32. However, these may be omitted, for example if the robot module 30 does not provide a functionality of detecting objects in a vicinity of the robot 20.
- the robot is placed on a robot module 30. However, this is non-limiting as the robot module 30 may be omitted, and the robot 20 may be affixed to any other suitable support.
- the displayed XR content 40 is a mixed reality content using an environment captured by imaging means on the visual device 120.
- the imaging means need not be located on the visual device 120 and may instead be at a predetermined location (e.g. on a wall, or mounted on a tripod) near the robot 20 and oriented towards the robot 20.
- the imaging means may be omitted, for example if the visual device 120 comprises means such as transparent lenses for letting light from the environment through to the user, on which virtual elements may be projected to augment the environment viewed by the user.
- the XR content 40 may instead be an augmented reality content or a virtual reality content.
- the XR content 40 includes a number of virtual elements, for example the planned trajectory 41, the points 42 along the planned trajectory, the pointer 43, the virtual representation 50' of the object 50, the virtual representation 20' of the robot (or one or more parts of the robot).
- a user input may be received to selectively hide any of these virtual elements from being displayed on the XR content 40, for example via the input module 110. Additionally, the user may select which part(s) of the robot should have a virtual representation displayed on the XR content 40.
- the user input received via the input module 110 may activate or interrupt any animation being displayed in the XR content 40, such as an animation of the virtual representation 20' of the robot 20 (or one or more parts of the robot selected by the user).
- the planned trajectory is for a movement of at least a part of the robot 20 that has not yet been initiated, and which is initiated when the input module 110 receives a second user input indicating that the movement should begin.
- the communication module 130 may transmit the signal causing the movement of the at least a part of the robot substantially along the new trajectory without waiting for a second user input to be received (i.e. without receiving a confirmation from the user that the new trajectory is approved).
- the XR device comprises one input module 110.
- more than one input module may be provided instead.
- two input modules 110 each held in one hand of the user 60.
- a larger number of input modules may be provided, and the user may select which to operate at any given time.
- the input module 110 transmits a first signal to the visual device 120, which in turn transmits a second signal to the communication module 130, or the input module 110 transmits the first signal to the control module 140.
- the input module 110 may instead transmit the first signal (e.g. broadcast or multicast the first signal) to the communication module 130 and/or the control module 140, as well as the visual device 120, or instead of the visual device.
- the visual device 120 receiving the first signal may merely adapt the XR content 40 being displayed. If the visual device does not receive the first signal, the communication module 130 or the control module 140 may transmit a separate signal to the visual device 120 indicating a required modification to the displayed XR content.
- the input module 110 is used to manipulate a planned trajectory of at least a part of the robot 20.
- the input module 110 may be operated (e.g. by pressing one or more buttons on the input module 110) to indicate a new desired operation of the robot 20 (i.e. other than manipulating a planned trajectory), such as instructing an end effector 24 having a suction cup to apply a vacuum to grip an object, to interrupt an applied vacuum to release an object, or to cause a 3D printing head to begin heating and extruding a filament, etc.
- the first signal may be transmitted to the communication module 130 and/or the control module 140 directly, rather than to the visual device 120.
- a visual alert is provided to the user via the XR content displayed by the visual device to indicate that the new trajectory desired by the user cannot be followed.
- a sound and/or haptic alert indicating that the new trajectory cannot be followed may be used to notify the user, instead of or in addition to the visual alert.
- the visual, sound and/or haptic alert may, instead of indicating an issue related to a user input, indicate a fault or other problem of the robot, for example that a component of the robot requires a repair or that a movement of the robot is blocked, and that the user should investigate.
- the example aspects described here avoid limitations, specifically rooted in computer technology, relating to the control of robots having at least one robotic arm.
- the user experience when controlling robots may be improved.
- the example aspects described herein improve computers and computer processing/functionality, and also improve the field(s) of at least robot control and data processing.
- Software embodiments of the examples presented herein may be provided as a computer program, or software, such as one or more programs having instructions or sequences of instructions, included or stored in an article of manufacture such as a machine-accessible or machine-readable medium, an instruction store, or computer-readable storage device, each of which can be non-transitory, in one example embodiment.
- the program or instructions on the non-transitory machine-accessible medium, machine-readable medium, instruction store, or computer-readable storage device may be used to program a computer system or other electronic device.
- the machine- or computer-readable medium, instruction store, and storage device may include, but are not limited to, floppy diskettes, optical disks, and magneto-optical disks or other types of media/machine-readable medium/instruction store/storage device suitable for storing or transmitting electronic instructions.
- the techniques described herein are not limited to any particular software configuration. They may find applicability in any computing or processing environment.
- the term "computer-readable" shall include any medium that is capable of storing, encoding, or transmitting instructions or a sequence of instructions for execution by the machine, computer, or computer processor and that causes the machine/computer/computer processor to perform any one of the methods described herein.
- Such expressions are merely a shorthand way of stating that the execution of the software by a processing system causes the processor to perform an action to produce a result.
- Some embodiments may also be implemented by the preparation of application-specific integrated circuits, field-programmable gate arrays, or by interconnecting an appropriate network of conventional component circuits.
- Some embodiments include a computer program product.
- the computer program product may be a storage medium or media, instruction store(s), or storage device(s), having instructions stored thereon or therein which can be used to control, or cause, a computer or computer processor to perform any of the procedures of the example embodiments described herein.
- the storage medium/instruction store/storage device may include, by example and without limitation, an optical disc, a ROM, a RAM, an EPROM, an EEPROM, a DRAM, a VRAM, a flash memory, a flash card, a magnetic card, an optical card, nano systems, a molecular memory integrated circuit, a RAID, remote data storage/archive/warehousing, and/or any other type of device suitable for storing instructions and/or data.
- some implementations include software for controlling both the hardware of the system and for enabling the system or microprocessor to interact with a human user or other mechanism utilizing the results of the example embodiments described herein.
- Such software may include without limitation device drivers, operating systems, and user applications.
- Such computer-readable media or storage device(s) further include software for performing example aspects of the invention, as described above.
- a module includes software, although in other example embodiments herein, a module includes hardware, or a combination of hardware and software.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
Description
Claims
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP24717173.9A EP4688341A1 (en) | 2023-04-05 | 2024-04-03 | Extended reality interface for robotic arm |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IT102023000006741 | 2023-04-05 | ||
| IT102023000006741A IT202300006741A1 (en) | 2023-04-05 | 2023-04-05 | Extended Reality Interface for Robotic Arm |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024208919A1 true WO2024208919A1 (en) | 2024-10-10 |
Family
ID=86732780
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/059089 Ceased WO2024208919A1 (en) | 2023-04-05 | 2024-04-03 | Extended reality interface for robotic arm |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4688341A1 (en) |
| IT (1) | IT202300006741A1 (en) |
| WO (1) | WO2024208919A1 (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140277737A1 (en) * | 2013-03-18 | 2014-09-18 | Kabushiki Kaisha Yaskawa Denki | Robot device and method for manufacturing processing object |
| US20210170603A1 (en) * | 2018-04-19 | 2021-06-10 | Yuanda Robotics Gmbh | Method for using a multi-link actuated mechanism, preferably a robot, particularly preferably an articulated robot, by a user by means of a mobile display apparatus |
| WO2022161637A1 (en) * | 2021-02-01 | 2022-08-04 | Abb Schweiz Ag | Visualization of a robot motion path and its use in robot path planning |
- 2023
  - 2023-04-05 IT IT102023000006741A patent/IT202300006741A1/en unknown
- 2024
  - 2024-04-03 EP EP24717173.9A patent/EP4688341A1/en active Pending
  - 2024-04-03 WO PCT/EP2024/059089 patent/WO2024208919A1/en not_active Ceased
Non-Patent Citations (1)
| Title |
|---|
| REINHART G ET AL: "A programming system for robot-based remote-laser-welding with conventional optics", CIRP ANNALS, ELSEVIER BV, NL, CH, FR, vol. 57, no. 1, 1 January 2008 (2008-01-01), pages 37 - 40, XP022674597, ISSN: 0007-8506, [retrieved on 20080509], DOI: 10.1016/J.CIRP.2008.03.120 * |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4688341A1 (en) | 2026-02-11 |
| IT202300006741A1 (en) | 2024-10-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9987744B2 (en) | Generating a grasp pose for grasping of an object by a grasping end effector of a robot | |
| US10105843B2 (en) | Robot device, remote control method of robot device, and program | |
| CN110888428B (en) | Mobile robot, remote terminal, computer readable medium, control system, control method | |
| US8965579B2 (en) | Interfacing with a mobile telepresence robot | |
| KR102859547B1 (en) | Robot system and Control method of the same | |
| US20150273689A1 (en) | Robot control device, robot, robotic system, teaching method, and program | |
| CN114728413A (en) | Method and system for controlling graphical user interface of remote robot | |
| CN121411450A (en) | System and method for initializing a robot to autonomously travel along a training route | |
| CN112638593A (en) | Augmented reality visualization techniques for robotic pick-up systems | |
| CN114341930B (en) | Image processing device, camera, robot, and robot system | |
| US20250249589A1 (en) | Systems and methods for teleoperated robot | |
| JP7517803B2 (en) | ROBOT TEACHING SYSTEM, IMAGE GENERATION METHOD, AND PROGRAM | |
| US20140285633A1 (en) | Robotic system and image display device | |
| KR20130072748A (en) | Device and method foruser interaction | |
| US11992960B2 (en) | Robot system, control apparatus of robot system, control method of robot system, imaging apparatus, and storage medium | |
| JPWO2022124398A5 (en) | ||
| EP3946825A1 (en) | Method and control arrangement for determining a relation between a robot coordinate system and a movable apparatus coordinate system | |
| JP2026503080A (en) | Method and system for calibrating a camera | |
| JP6842441B2 (en) | Programs and information processing equipment | |
| US11407117B1 (en) | Robot centered augmented reality system | |
| WO2024208919A1 (en) | Extended reality interface for robotic arm | |
| JP6657858B2 (en) | Robot operation system | |
| KR101975556B1 (en) | Apparatus of controlling observation view of robot | |
| EP3599539B1 (en) | Rendering objects in virtual views | |
| US12095964B2 (en) | Information processing apparatus, information processing method, and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24717173; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024717173; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2024717173; Country of ref document: EP; Effective date: 20251105 |