
WO2024208919A1 - Extended reality interface for robotic arm - Google Patents

Extended reality interface for robotic arm

Info

Publication number
WO2024208919A1
WO2024208919A1 (application no. PCT/EP2024/059089)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
user
trajectory
content
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2024/059089
Other languages
French (fr)
Inventor
Valentina SUMINI
Francesca MINCIGRUCCI
Stefano SINIGARDI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GD SpA
Original Assignee
GD SpA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GD SpA filed Critical GD SpA
Priority to EP24717173.9A priority Critical patent/EP4688341A1/en
Publication of WO2024208919A1 publication Critical patent/WO2024208919A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36453Handheld tool like probe
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39451Augmented reality for robot programming

Definitions

  • Example aspects herein relate to robotics, and in particular to an extended reality, XR, device for a user controlling a robot, a system, a method for controlling a robot, and a computer program.
  • Robots, and in particular robots having a robotic arm, are used in various industries to process products.
  • Programming the movement of a robot requires advanced skills and knowledge, such as computer programming skills, computer process control and the calculation of advanced angles, positions etc. These requirements limit the potential use of robots.
  • an extended reality, XR, device for a user controlling a robot comprising a robotic arm and an end effector.
  • the XR device comprises a visual device configured to display an extended reality, XR, content to said user.
  • said XR content represents an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot.
  • the XR device comprises an input module configured to receive a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
  • the XR device comprises a communication module configured to transmit a signal for causing said at least a part of said robot to move substantially along said new trajectory.
  • the visual device may be any suitable device (e.g. a headset with an integrated display) to be worn or otherwise used by the user to visualise an XR content.
  • the XR content may be any content displayed to a user, including at least one 2D or 3D virtual element of digital information.
  • the at least one virtual element may be superimposed onto a real environment, either captured or viewed live by the user, the at least one virtual element may be superimposed onto a virtual environment, or the at least one virtual element may represent a virtual environment.
  • the XR content may be any of an augmented reality (AR) content, a mixed reality (MR) content, or a virtual reality (VR) content.
  • the displayed environment is a real environment, which may be captured by imaging means.
  • the imaging means may be located on the visual device (to correspond to a point of view of the user using the visual device) or may be located near the robot and be oriented towards the robot to capture images of the robot.
  • the visual device may comprise transparent lenses allowing the user to view the real environment.
  • the real environment may be shown to the user in real-time (or with negligible delay such as a few milliseconds between a capture of image and a display to the user).
  • the AR or MR content may be superimposed on an image of the environment (captured by the imaging means) or projected on the transparent lenses so that the user views the AR or MR content and the real environment.
  • a virtual environment is displayed.
  • the virtual environment may be generated by a computer model of a real environment drawn or measured (e.g. using a LIDAR or other point cloud technology).
  • the input module may be any suitable device held or otherwise used by the user, which allows the user to interact with the XR content. Specifically, a virtual pointer corresponding to the input module is displayed in the XR content. As the user moves the input module, the movement of the input module is captured (e.g. using sensors such as accelerometers in the input module and/or a system placed around the user to monitor movements of the input module), and the position of the virtual pointer moves according to the movement of the input module.
  • the term manipulating should be understood to mean any interaction by the user with the virtual element representing the trajectory of at least a part of the robot, using any suitable input mechanism.
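  • By way of illustration only, the following Python sketch shows one plausible way such controller-to-pointer tracking could work; the Pose type, motion gain and update method are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: maps hypothetical controller pose deltas to a
# virtual pointer position in the XR scene. Names and scaling are assumptions.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

class VirtualPointer:
    """Tracks an input module and mirrors its motion in XR content."""

    def __init__(self, scale: float = 1.0):
        self.scale = scale                   # motion gain: real -> virtual space
        self.position = Pose(0.0, 0.0, 0.0)

    def on_controller_moved(self, delta: Pose) -> Pose:
        # Apply the captured controller displacement (e.g. integrated from
        # accelerometer data or an external tracking system) to the pointer.
        self.position = Pose(
            self.position.x + self.scale * delta.x,
            self.position.y + self.scale * delta.y,
            self.position.z + self.scale * delta.z,
        )
        return self.position

pointer = VirtualPointer(scale=1.5)
print(pointer.on_controller_moved(Pose(0.02, 0.0, -0.01)))
```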
  • the planned trajectory indicates a pre-programmed trajectory that the robot would follow when operating unless the user modifies the trajectory.
  • the planned trajectory may be a trajectory of an ongoing movement of the robot or a movement that has not yet initiated.
  • This trajectory may be for a specific part of the robot (e.g. a movement of a segment of the robotic arm, a rotation of a joint of the robotic arm, a movement, a rotation, or an operation of the end effector), or it may be a trajectory for multiple parts, or substantially the entire robotic arm (or entire robot).
  • by "substantially along the new trajectory", it is intended to mean that the trajectory taken by the at least a part of the robot may coincide with the trajectory defined by the user input as closely as possible, or it may involve some modifications to the trajectory defined by the user input, as will be explained in more detail below.
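  • A minimal sketch of how such closeness could be quantified is given below; the point-wise deviation metric and the 0.05 m tolerance are illustrative assumptions only.

```python
# Hypothetical check for whether an executed path stays "substantially along"
# a user-defined trajectory, using a maximum point-wise deviation.
import math

def max_deviation(executed, defined):
    """Largest distance from each executed point to its nearest defined point."""
    return max(min(math.dist(p, q) for q in defined) for p in executed)

def substantially_along(executed, defined, tolerance_m=0.05):
    return max_deviation(executed, defined) <= tolerance_m

defined = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.1), (1.0, 0.4, 0.2)]
executed = [(0.0, 0.01, 0.0), (0.52, 0.2, 0.12), (1.0, 0.41, 0.2)]
print(substantially_along(executed, defined))  # True
```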
  • the communication module may transmit the signal via any suitable communication link, either directly to the robot or to another device that controls the movement of the robot.
  • the XR device displaying the XR content provides an XR interface for the user controlling the robot.
  • the XR device (and the XR interface provided by the XR device) improves user experience when controlling robots.
  • said planned trajectory is a planned trajectory of at least one of said robotic arm and said end effector.
  • said XR content further comprises a virtual representation of said at least a part of said robot, more preferably of said robot.
  • said visual device is configured to display a movement of said virtual representation in accordance with at least one of said planned trajectory and said new trajectory.
  • the user may visualize how the robot would move (according to the planned trajectory or the new trajectory) before the robot begins moving, thus improving the user's experience.
  • said input module is configured to receive a user input indicating a selection between two or more of: said XR content displaying a virtual representation of said robot entirely, said XR content displaying a virtual representation of one or more parts of said robot selected by said user, and said XR content not displaying any virtual representation of said robot or part of said robot.
  • said input module is configured to receive a user input indicating a selection between said XR content displaying an animation of a virtual representation of at least a part of said robot, and said XR content displaying a still virtual representation of at least a part of said robot.
  • said input module receives a second user input for triggering said communication module to transmit said signal.
  • the user may select a time when the at least a part of the robot should move in accordance with the defined new trajectory, thus allowing the user to modify the trajectory as desired.
  • said input module is configured to receive said user input as the robot is moving along the planned trajectory, and said communication module transmits a signal that causes said at least a part of said robot to modify an ongoing movement.
  • the user may control ongoing operations of the robot in real-time, thus allowing the user to correct potential issues with the planned trajectory.
  • said XR device further comprises a control module configured to cooperate with said input module to define said new trajectory in accordance with a set of predetermined constraints defined for at least one of said environment and said robot.
  • the trajectory defined by the user may be modified to meet the predetermined constraints, for example to avoid collision between the robot and elements of the environment, to avoid damage to the robot or to an object handled by the robot, etc. Accordingly, the user does not need to follow the predetermined constraints when defining a new trajectory (or check whether the new trajectory satisfies all predetermined constraints) as the control module makes this determination, which therefore improves user experience when defining the desired trajectory, whilst ensuring predetermined constraints are satisfied.
  • said set of predetermined constraints comprises at least one of a physical constraint and an operational constraint.
  • the set may comprise any number (e.g. more than one) of physical constraints and any number (e.g. more than one) of operational constraints.
  • the physical constraint(s) may depend on an operational limit of the robot, or the physical constraint(s) may depend on the physical environment of the robot.
  • Physical constraint(s) depending on an operational limit of the robot may include, for example, a limit to possible configurations of the robot (e.g. a limit to rotations or extension of parts of the robot).
  • Physical constraint(s) depending on the physical environment of the robot may include one or more spaces (two or three dimensional) around the robot that the robot cannot enter (for example zones where obstacles are present or are likely to be present).
  • the operational constraint(s) depend on an operation that the robot is to perform, and may depend for example on an object to be processed by the end effector, such as a preferred orientation of the object when moved by the robotic arm (e.g. to prevent spillage or damage to the object), a restriction in movement in a particular direction when the object is held by the robotic arm, etc.
  • constraint(s) may be predetermined to ensure the operation of the robot does not cause damage to the object being processed, to the robot, or to the environment of the robot.
  • said control module is configured to cooperate with said input module to modify said new trajectory, to reduce at least one of: an amount of energy required for said at least a part of said robot to reach an end point of said new trajectory, a risk of collision of at least one of said robot and an object manipulated by said robot with said environment, and an effect that a movement of said at least a part of said robot substantially along said new trajectory would have on an object manipulated by said robot.
  • the new trajectory that is to be followed by the robot can be optimized, in terms of energy resources and/or risk reduction.
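  • A hedged sketch of how such an optimisation could be framed follows, using path length as a proxy for energy and obstacle clearance as a proxy for collision risk; all weights, inputs and function names are assumptions, not the disclosed method.

```python
# Illustrative sketch only, not the patented algorithm: score candidate
# trajectories on assumed proxies for energy use and collision risk, then
# keep the best-scoring candidate.
import math

def path_length(traj):
    # Proxy for energy: a shorter path is assumed to need less energy.
    return sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))

def collision_risk(traj, obstacles, clearance_m=0.3):
    # Proxy for risk: count waypoints passing too close to known obstacles.
    return sum(1 for p in traj for o in obstacles if math.dist(p, o) < clearance_m)

def best_trajectory(candidates, obstacles, risk_weight=10.0):
    return min(candidates,
               key=lambda t: path_length(t) + risk_weight * collision_risk(t, obstacles))

a = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
b = [(0, 0, 0), (1, 1, 0), (2, 0, 0)]
print(best_trajectory([a, b], obstacles=[(1, 0.1, 0)]))  # picks the safer path
```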
  • said communication module is configured to communicate with a robot module on which said robot is placed.
  • said robot module comprises a plurality of sensors, preferably configured to sense a space in a vicinity of said robot and said robot module for the presence of any object.
  • the XR device and the module may exchange information related to the current environment of the robot placed on the robot module.
  • the XR device may provide information to the robot module on the new trajectory of the robot, which may use this information to adapt the sensing of the space in the vicinity of the robot and the module.
  • said communication module is configured to obtain data related to at least one of said robot and said space from said robot module.
  • the data obtained from the robot module may indicate a fault occurring in the robot or in the robot module, a sensed object to be represented in the XR content, or a potential collision with the sensed object.
  • the data obtained from the robot module may be used to present suitable information to the user as part of the XR content, and/or to modify the trajectory in accordance with the current environment of the robot.
  • said visual device is configured to modify said virtual element in accordance with said received user input substantially contemporaneously with said user manipulating said virtual element.
  • by "substantially contemporaneously with", it is intended to mean that the modification is displayed simultaneously or with a negligible delay (e.g. a few milliseconds).
  • the manipulation performed by the user may be instantaneously shown to the user, thus allowing the user to better define the new trajectory.
  • said virtual element comprises at least one of a curve and a number of points in a virtual three dimensional space.
  • the curve may be a line or a plurality of segments that may be linked.
  • the virtual element may also comprise a plurality of curves separate from each other (e.g. showing only parts of the trajectory) or a number (one or more) of points that should be reached by the part of the robot in a given order.
  • the user may easily visualize the trajectory and identify all or a set of points that the part of the robot should travel to.
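  • As a purely illustrative data structure (the field names are invented), the virtual element could be held as an ordered list of waypoints per robot part, from which the linked segments of the curve follow directly:

```python
# Minimal sketch of a trajectory virtual element: an ordered list of 3D
# waypoints for one part of the robot. Field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class TrajectoryElement:
    part: str                                      # e.g. "end_effector" or "joint_3"
    waypoints: list = field(default_factory=list)  # ordered (x, y, z) points

    def as_segments(self):
        """Consecutive waypoint pairs, i.e. the linked segments of the curve."""
        return list(zip(self.waypoints, self.waypoints[1:]))

elem = TrajectoryElement("end_effector", [(0, 0, 0), (0.3, 0.1, 0.2), (0.6, 0.1, 0.4)])
print(elem.as_segments())
```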
  • said user input modifies said virtual element by dragging, or by moving a part of said virtual element in said environment represented by said XR content.
  • the part may be a point or a section of a curve representing the trajectory.
  • the user may therefore selectively modify a part of the trajectory, whilst the rest of the trajectory is either substantially unchanged or is adapted to be in conformity with the modified part.
  • the user may easily modify only a part of the trajectory.
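  • One conceivable implementation of such selective editing is sketched below; the linear falloff that blends the dragged point into its neighbours is an assumption, not specified in the source.

```python
# Hedged sketch: drag one waypoint while blending the change into nearby
# waypoints so the rest of the trajectory stays consistent with the edit.
def drag_waypoint(waypoints, index, new_point, falloff=2):
    pts = [list(p) for p in waypoints]
    delta = [n - o for n, o in zip(new_point, pts[index])]
    for i, p in enumerate(pts):
        d = abs(i - index)
        if d <= falloff:
            w = 1.0 - d / (falloff + 1)     # neighbours move proportionally less
            for k in range(len(p)):
                p[k] += w * delta[k]
    return [tuple(p) for p in pts]

path = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0)]
print(drag_waypoint(path, 1, (1, 0.5, 0)))  # neighbours follow with reduced weight
```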
  • said XR device comprises means to alert said user of an issue related to at least one of said new trajectory and said robot.
  • the issue that is alerted may indicate that the new trajectory should be rectified, for example if it fails to comply with a predetermined constraint, or the issue may indicate a technical issue with the robot that should be attended to by the user, for example to repair, replace or maintain the robot (or the robot module).
  • said means to alert comprise at least one of audio, visual and haptic means.
  • the visual device may be coupled to loudspeakers, earphones etc. to alert the user.
  • the visual alert may be provided as part of the XR content, or (in case the XR content is AR or MR content) it may be a visual alert placed in a location that is likely to be in the field of view of the user (for example on the input module), so that the user viewing the XR content is likely to view the visual alert.
  • the haptic means may be provided via the input means or any other part of the XR device in contact with the user (e.g. the headset worn by the user), and may generate a vibration to alert the user.
  • a system comprising the XR device according to the first example aspect disclosed herein, and said robot.
  • the system comprises said robot module.
  • a method for a user controlling a robot comprising a robotic arm and an end effector.
  • the method comprises displaying an extended reality, XR, content by means of a visual device, said XR content preferably representing an environment in which at least one of said user and said robot is located, and preferably comprising a virtual element representing a planned trajectory for at least a part of said robot.
  • the method comprises receiving a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
  • the method comprises transmitting a signal for causing said at least a part of said robot to move substantially along said new trajectory.
  • said planned trajectory is a planned trajectory of at least one of said robotic arm and said end effector.
  • said XR content further comprises a virtual representation of said at least a part of said robot.
  • the method further includes displaying a movement of said virtual representation in accordance with at least one of said planned trajectory and said new trajectory.
  • the method further comprises modifying said new trajectory in accordance with a set of predetermined constraints defined for at least one of said environment and said robot.
  • said set of predetermined constraints comprises at least one of a physical constraint and an operational constraint.
  • said modifying said new trajectory includes reducing at least one of: an amount of energy required for said at least a part of said robot to reach an end point of said new trajectory, a risk of collision of at least one of said robot and an object manipulated by said robot with said environment, and an effect that a movement of said at least a part of said robot along said new trajectory would have on an object manipulated by said robot.
  • a computer program comprising instructions which, when executed by one or more processors, cause said one or more processors to carry out the method according to the third example aspect disclosed herein.
  • a method for a user controlling a robot comprising a robotic arm and an end effector, the method comprising causing a visual device to display an extended reality, XR, content to said user, said XR content representing an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot; receiving a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot; and transmitting a signal for causing said at least a part of said robot to move substantially along said new trajectory.
  • a computer program comprising instructions which, when executed by one or more processors, cause said one or more processors to carry out the method according to the fifth example aspect disclosed herein.
  • Figure 1 shows a schematic view illustrating an example of a system in example embodiments;
  • Figure 2 shows a schematic diagram illustrating an example of an XR device in example embodiments;
  • Figure 3 shows a schematic view illustrating an example of an XR device and an XR content displayed to a user, in example embodiments;
  • Figure 4 shows a schematic diagram illustrating an example of an XR device in example embodiments;
  • Figures 5 to 8 show schematic views illustrating examples of XR content displayed to a user, in example embodiments;
  • Figures 9a and 9b show schematic views illustrating an example of a robot module coupled to a robot, in example embodiments;
  • Figure 10 shows a schematic diagram illustrating an example of a general kind of programmable processing apparatus that may be used to implement a control module in example embodiments; and
  • Figure 11 shows processing operations performed in example embodiments.
  • Each communication link may be, for example, a wireless communication link (for example Wi-Fi, a cellular telephone data link such as LTE/5G, Bluetooth or Bluetooth Low Energy (BLE)) or a wired communication link (e.g. DSL, fibre-optic cable, Ethernet etc.).
  • Each communication link may not be permanent.
  • the following description will simply refer to elements transmitting or receiving signals, information, or data, which should be understood to be via a communication link established between these elements (either directly or via one or more intermediary elements acting as a relay).
  • the system comprises an XR device 10, a robot 20 and a robot module 30.
  • the XR device 10 comprises a visual device 120 (shown on the example of Figure 1 in the form of a wearable headset), and an input module 110 (shown on the example of Figure 1 in the form of a remote controller with a button).
  • the user 60 holds the input module 110, and views an XR content through the visual device 120.
  • a communication module 130 may be co-located with the visual device 120 (e.g. in the wearable headset as shown on Figure 1).
  • the robot 20 comprises at least a robotic arm 22 having a number of segments and an end effector 24.
  • the end effector 24 may be used for a range of processes, where the process may be at least one of picking and placing objects (e.g. pick-and-place), assembling products, additive manufacturing (e.g. 3D printing), and visual detecting (e.g. detecting a set of points in an environment using imaging/sensing means and using the detected set of points to identify/optimise a trajectory of the end effector in the environment, or to determine how the end effector should contact an object corresponding to the set of points).
  • the robot module 30 is communicatively coupled to the robot 20 and comprises a plurality of sensors.
  • the robot module 30 provides one or more functions such as detecting objects in a space in a vicinity of the robot 20 and controlling the robot 20 to avoid or minimise an effect of a collision with a detected object, or facilitating communication between the robot 20 and the XR device 10 by converting data to/from the robot into a data format that is compatible with the XR device 10.
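  • A hypothetical sketch of the format-conversion role follows; both the robot-side byte layout and the XR-side JSON schema are invented placeholders, as the source does not specify either format.

```python
# Purely illustrative: the robot module translating between an assumed
# robot-side byte protocol and an assumed XR-device-side JSON format.
import json
import struct

def robot_to_xr(packet: bytes) -> str:
    """Unpack a hypothetical robot status packet into JSON for the XR device."""
    joint_count = len(packet) // 4
    angles = struct.unpack(f"<{joint_count}f", packet)
    return json.dumps({"joint_angles_rad": list(angles)})

def xr_to_robot(message: str) -> bytes:
    """Pack a hypothetical XR trajectory command into robot-side bytes."""
    angles = json.loads(message)["joint_angles_rad"]
    return struct.pack(f"<{len(angles)}f", *angles)

raw = struct.pack("<3f", 0.1, 0.5, -0.2)
print(robot_to_xr(raw))
```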
  • the top part of Figure 1 shows an example of an XR content 40 that may be displayed to the user.
  • the XR content is a mixed reality (MR) content, which is captured using stereoscopic imaging means provided on the visual device 120 and oriented so as to coincide with the view of the user 60.
  • the orientation of the visual device 120 is monitored using a monitoring system (not shown on Figure 1) detecting changes in movement of the visual device 120 due to movements of the head of the user 60.
  • the MR content includes one or more real elements being displayed, such as the robot 20 and the robot module 30, and a number of virtual elements being displayed, such as a planned trajectory 41 of the end effector 24, a number of points 42 along the planned trajectory, and a pointer 43 corresponding to the input module 110, the user 60 being able to control the position of the pointer 43 within the MR content by moving the input module 110.
  • although Figure 1 shows a virtual element for a planned trajectory of a single part of the robot, this is purely for simplicity.
  • the XR content 40 may comprise virtual elements (e.g. curves or points) representing a planned trajectory for a plurality of parts of the robot, for example a trajectory for each rotatable joint of the robotic arm 22.
  • Each of these virtual elements may be manipulable by the user.
  • the XR device 10 comprises an input module 110, a visual device 120, and a communication module 130. As explained above, each of these elements of the XR device 10 may be communicatively coupled to each other using any suitable communication link.
  • the input module 110 is controlled by the user as an input mechanism to provide instructions to the XR device 10.
  • the input module 110 comprises, or is coupled with, means for detecting a movement of the input module 110 that is imparted by the user holding the input module 110. For example, when the user moves the input module 110, this movement is detected (e.g. using sensors in the input module 110 or a separate monitoring system monitoring the position and orientation of the input module 110).
  • the input module 110 comprises a number of input means such as buttons, sliders, triggers etc.
  • Each input means has a predetermined (or user configurable) associated operation having an effect on the XR content.
  • For example, one of the input means (e.g. a button) and a second input means (e.g. a slider) may each be associated with a respective operation, and a third input means may indicate a desired switch between two different display modes (e.g. with different levels of information being displayed) etc.
  • Upon detecting a user operation (e.g. a movement and/or an operation of one or more of the input means), a first signal indicating this user operation is transmitted to the visual device 120.
  • the visual device 120 displays an XR content to the user, using a display or by projecting virtual elements of XR content on optical elements that are located in the user's view.
  • the visual device 120 displays a virtual pointer corresponding to the input module 110, and a virtual element representing a planned trajectory of at least a part of the robot 20 (e.g. of a part of the robotic arm, or of the end effector).
  • the visual device 120 determines whether the user operation indicated by the first signal requires a modification to the displayed XR content.
  • the visual device 120 modifies the XR content in accordance with the user operation. For example, if the user operation indicates a movement of the input module 110, the visual device 120 causes a virtual pointer corresponding to the input module 110 to move in accordance with the movement of the input module 110. Additionally, if the first signal indicates that the user operated one of the input means, the visual device 120 can modify the displayed XR content in accordance with any user operation corresponding to the operated input means, for example by adjusting a zoom level, toggling between different modes of displaying information, etc.
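  • A minimal dispatch sketch of this behaviour is shown below; the operation names and signal fields are invented for illustration only.

```python
# Assumed sketch: the visual device routing user operations reported in the
# first signal to the corresponding XR content updates.
def handle_first_signal(signal: dict, content: dict) -> dict:
    op = signal.get("operation")
    if op == "move":
        content["pointer"] = signal["pointer_position"]      # follow the input module
    elif op == "zoom":
        content["zoom_level"] = signal["zoom_level"]         # adjust zoom level
    elif op == "toggle_display_mode":
        content["display_mode"] = signal["mode"]             # switch display mode
    return content

content = {"pointer": (0, 0, 0), "zoom_level": 1.0, "display_mode": "full"}
print(handle_first_signal({"operation": "zoom", "zoom_level": 2.0}, content))
```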
  • the visual device 120 also determines whether the first signal indicates that a user manipulation (e.g. a click-and-drag operation) was performed on the virtual element representing the planned trajectory of the robot.
  • the visual device 120 modifies the virtual element in accordance with the user manipulation, and transmits a second signal to the communication module 130 indicating the user manipulation and corresponding change to the virtual element.
  • the communication module 130 comprises means for communicating with at least the robot 20.
  • the communication module 130 may also communicate with an external device.
  • upon receiving the second signal, the communication module 130 generates a third signal for causing the robot 20 to perform the desired change to the operation in accordance with the user manipulation.
  • This third signal may be transmitted to the robot 20 directly, or it may be transmitted to an external device, separate from the robot 20 and the XR device 10, which in turn controls the robot 20.
  • signals between the input module 110, the visual device 120 and the communication module 130, or signals transmitted from the communication module 130 to cause an operation of the robot 20 may be transmitted recurringly.
  • the first signal may be generated recurringly, for example every millisecond, at least when a movement of the input module 110 or a user operation on an input means is detected, such that the visual device 120 receives the first signal and modifies the XR content as the user moves the input module 110 or operates the input means. Accordingly, the user may receive visual feedback of the current operation on the input module 110.
  • a delay between the user operation and the corresponding modification to the XR content is below or near a human perception threshold, such that the user feels the XR content changes simultaneously with the operation on the input module 110.
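  • The sketch below illustrates such a fixed-rate loop, taking the 1 ms interval from the example above; the 20 ms perception budget is an assumed figure, not given in the source.

```python
# Sketch under stated assumptions: a fixed-rate polling loop that forwards
# input-module state to the display while watching an end-to-end delay budget.
import time

PERIOD_S = 0.001          # recurring first-signal interval from the example
LATENCY_BUDGET_S = 0.020  # assumed human-perception threshold

def run_input_loop(read_input, update_display, cycles=5):
    for _ in range(cycles):
        start = time.perf_counter()
        update_display(read_input())                 # propagate the state change
        elapsed = time.perf_counter() - start
        if elapsed > LATENCY_BUDGET_S:
            print("warning: frame over latency budget")
        time.sleep(max(0.0, PERIOD_S - elapsed))     # hold the 1 ms cadence

run_input_loop(lambda: {"pos": (0, 0, 0)}, lambda state: None)
```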
  • the visual device 120 displays an XR content 40 to the user.
  • the XR content 40 comprises real elements including the robot 20, an object 50 to be processed by the robot 20 and a working surface 51 on which the object 50 is to be moved.
  • the XR content 40 also includes a number of virtual elements including a planned trajectory of the robot 20 (in this case, the planned trajectory of the end effector), when the robot 20 picks up the object 50 and moves it from an initial position to a final position (as indicated by element 50', which may be a virtual representation of the object 50).
  • any number of real elements or virtual elements may be displayed instead, and real elements may be omitted (for example in the case of a virtual content being displayed).
  • the XR content 40 also includes a virtual pointer 43 which moves as the user moves the input module 110.
  • the robot 20 receives a signal instructing the object 50 to be picked-up and moved along trajectory 44 to the final position 50'.
  • the XR device 11 comprises a control module 140.
  • the control module 140 transmits a signal to the visual device 120 indicating the XR content to be displayed to the user.
  • the input module 110 transmits the first signal to the control module 140, instead of the visual device 120.
  • the control module 140 determines whether the first signal indicates that a user manipulation (e.g. a click-and-drag operation) was performed on the virtual element representing the planned trajectory of the robot.
  • the control module 140 determines a modification required to the displayed XR content and transmits a signal to the visual device 120 to cause the visual device 120 to display the modified XR content.
  • the control module 140 determines whether the new trajectory defined by the user manipulation satisfies a number of predetermined operational and physical constraints that are defined for the robot 20 and for the environment.
  • the control module 140 determines a modification required to the displayed XR content and transmits a signal to the visual device 120 to cause the visual device to display the modified XR content. Then, upon receiving another signal from the input module 110 indicating the user confirmed the displayed new trajectory (e.g. a second user input), the control module 140 transmits another signal to the communication module 130 to cause the communication module 130 to transmit the signal for causing the robot 20 to move along the new trajectory.
  • the control module 140 either modifies the new trajectory to comply with the predetermined constraints or alerts the user that the robot 20 cannot proceed along the new trajectory, thus allowing the user to re-define a new trajectory.
  • the control module 140 and the input module 110 cooperate to define a new trajectory in accordance with the predetermined constraints, as in the sketch below.
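  • The cooperation described above might be organised as in the following sketch, in which the constraint, repair and alert functions are placeholders supplied by the caller; none of these names comes from the source.

```python
# Hypothetical control-module flow: check a user-defined trajectory against
# predetermined constraints, then forward it, repair it, or alert the user.
def process_new_trajectory(traj, constraints, repair, alert_user, transmit):
    violated = [c for c in constraints if not c(traj)]
    if not violated:
        transmit(traj)                    # trajectory complies; send as-is
        return traj
    repaired = repair(traj, violated)     # try to modify toward compliance
    if repaired is not None and all(c(repaired) for c in constraints):
        transmit(repaired)                # repaired trajectory complies
        return repaired
    alert_user(violated)                  # cannot proceed; ask for a new input
    return None

def max_points(t):                        # toy physical constraint: <= 4 waypoints
    return len(t) <= 4

def demo_repair(t, violated):             # toy repair: truncate the trajectory
    return t[:4]

path = [(0, 0)] * 6
process_new_trajectory(path, [max_points], demo_repair, print, print)
```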
  • the control module 140 may cause the visual device 120 to display an XR content comprising virtual elements (e.g. curves or points) representing a planned trajectory for a plurality of parts of the robot, for example a trajectory for each rotatable joint of the robotic arm 22.
  • Each of these virtual elements may be manipulable by the user.
  • each joint may be rotatable by a different amount, or each extendable segment of the robotic arm may have a different load capacity.
  • the control module 140 may determine whether the indicated user manipulation satisfies the set of constraints predetermined for the corresponding part of the robot.
  • the XR content 40 shown on Figure 5 includes the robot 20, and a virtual representation 20' of the robot 20 (e.g. a virtual twin of the robot).
  • the XR content 40 comprises a virtual element 41 representing a planned trajectory of the robot.
  • a virtual element representing the new trajectory defined by the user manipulation may be displayed on the XR content 40, instead of or in addition to the virtual element 41 representing the planned trajectory.
  • the XR content 40 shows an animation of the virtual representation 20' of the robot 20 as it moves along the trajectory 41, thus allowing the user to visualize how the robot 20 would move when instructed.
  • the XR content 40 may show a virtual representation of only a part of the robot 20 (e.g. the part corresponding to the trajectory 41), which may be animated as the represented part moves along the trajectory 41.
  • the user may select between the XR content displaying a virtual representation of the entire robot 20, a virtual representation of one or more parts of the robot 20 selected by the user (e.g. only the end effector 24, or the end effector 24 and a rotatable joint connected to the end effector 24), or not viewing any virtual representation of the robot 20.
  • the user may also select between any virtual representation of a (part of) the robot 20 being animated or not. User selections may be performed, for example, via input means provided on the input module 110.
  • the XR content 40 shown on Figure 6 includes the robot 20, and a virtual element 41 representing the planned trajectory of the robot.
  • the XR content includes a new trajectory 44 ending at a new final position 46, that the user defined by drawing the new desired trajectory 44 (i.e. not by dragging a part of the virtual element 41).
  • the control module 140 determines that the new trajectory does not comply with a predetermined physical constraint.
  • the control module 140 causes the visual device 120 to display a visual alert 47 which notifies the user of the issue.
  • the XR content 40 shown on Figure 7 includes the robot 20, and a virtual element 44 representing the new trajectory of the robot 20 defined by user manipulation, ending at a final position 46.
  • the control module 140 determines that the new trajectory represented by the virtual element 44 does not comply with an operational constraint, namely that an amount of energy required for the robot 20 to reach the final position 46 is not minimized (in other words, that the trajectory can be optimized in terms of energy requirements).
  • the control module 140 modifies the new trajectory to minimize the required energy, and causes the visual device 120 to include in the XR content 40 a virtual element 48 representing the new trajectory modified to minimize the required energy.
  • the control module 140 may then (either when the trajectory represented by the virtual element 48 is approved by the user or automatically) transmit a signal to the communication module 130, which in turn transmits a signal for causing the robot 20 to move along the trajectory represented by the virtual element 48. Because the modified new trajectory (represented by the virtual element 48) has the same end position 46 as the new trajectory defined by the user (represented by the virtual element 44), a movement of the robot 20 along one of the trajectories is considered to be substantially along the other.
  • the XR content 40 shown on Figure 8 includes the robot 20, an object 50 to be picked-up by the robot 20, and an object 52 present in the environment of the robot.
  • the XR content 40 comprises a virtual element 44 representing the new trajectory of the robot 20 defined by user manipulation.
  • the control module 140 determines that the new trajectory represented by the virtual element 44 does not comply with a predetermined constraint, namely that a collision with the environment (and specifically objects in the environment) should be avoided.
  • the control module 140 modifies the new trajectory to avoid the collision, and causes the visual device 120 to include in the XR content 40 a virtual element 48 representing the new trajectory modified to avoid a collision.
  • the robot 20 may then move the object 50 according to the new modified trajectory represented by the virtual element 48.
  • the robot 20 includes a robotic arm 22, an end effector 24 and a base 26.
  • the robot module 30 comprises a plurality of sensors 32 and an interface unit 34. With the sensors 32, the robot module 30 may detect objects in a space in a vicinity of the robot 20 and control the robot 20 to avoid or minimise an effect of a collision with a detected object.
  • the robot module 30 may facilitate communication between the robot 20 and the XR device 10 by converting data to/from the robot into a data format that is compatible with the XR device 10.
  • the interface unit 34 is located on a top surface of the robot module 30 and is mechanically coupled to the robot 20 via the base 26 of the robot, using for example fastening means such as bolts and threaded through-holes.
  • the top surface of the robot module 30 has a substantially octagonal shape, and a bottom surface of the robot module 30 has a substantially square shape.
  • the robot module 30 has eight side surfaces (four triangular and four trapezoidal) joining the top surface and the bottom surface. Between the top surface and the bottom surface (defining planes P1 and P2, respectively), the robot module 30 defines a taper which is narrower at the top surface than the bottom surface.
  • Each of these sides includes a respective sensor 32, although Figure 9a only shows four of these sensors.
  • each sensor 32 is facing a respective direction away from the robot module 30 (and the robot 20), and can therefore detect a space around the robot module 30 and the robot 20 for any object that should not collide with the robot 20.
  • the robot module 30 has a tapered outer shape, which is narrower towards the top.
  • a cross section along a horizontal plane of the robot module 30 has a smaller area towards the top of the robot module 30 than towards the bottom of the robot module 30. Accordingly, more components of the robot module 30 may be placed towards the bottom (i.e. towards plane P2), thus lowering the center of gravity of the robot module 30, which in turn improves its stability.
  • Each sensor 32 may comprise one or more cameras (e.g. IR, RGB/visible or multispectral cameras), infrared or ultrasonic sensors, LIDARs, or other suitable sensors.
  • the sensors 32 comprise a plurality of RGB cameras.
  • each sensor 32 comprises at least an RGB camera (i.e. a camera capturing visible light), and processing means for identifying object(s) on a captured image and for estimating a distance from the sensor 32 to each identified object. Based on this distance, the sensor 32 may determine whether each object is within a predetermined space in a vicinity of the robot and the module.
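  • By way of illustration, the per-sensor vicinity check could resemble the following sketch; the radius and the detection input format are assumptions, not values from the source.

```python
# Illustrative sketch of the per-sensor check described above: estimate a
# distance to each detected object and flag those inside a predetermined space.
import math

VICINITY_RADIUS_M = 1.5   # assumed predetermined space around robot and module

def objects_in_vicinity(detections, sensor_position):
    """detections: list of (object_id, (x, y, z)) estimated by the sensor."""
    hits = []
    for obj_id, pos in detections:
        if math.dist(sensor_position, pos) < VICINITY_RADIUS_M:
            hits.append(obj_id)           # object is within the monitored space
    return hits

print(objects_in_vicinity([("crate", (1.0, 0.5, 0.0)), ("cart", (4.0, 0.0, 0.0))],
                          (0.0, 0.0, 0.0)))  # ['crate']
```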
  • Figure 9b shows a top view of the robot 20 and the robot module 30. For illustrative purposes, only a part of the robot 20 is shown on Figure 9b.
  • by orienting each sensor 32 in a different direction, a space 36 in the vicinity of the robot 20 and the robot module 30 may be sensed for objects.
  • Referring to Figure 10, an example of a general kind of programmable processing apparatus 70 that may be used in various components of the XR device 10 or 11 described herein, such as the input module 110, the visual device 120, the communication module 130, or the control module 140, is shown.
  • the programmable processing apparatus 70 comprises one or more processors 71, one or more input/output communication modules 72, one or more working memories 73, and one or more instruction stores 74 storing computer-readable instructions which can be executed by one or more processors 71 to perform the processing operations as described hereinafter.
  • An instruction store 74 is a non-transitory storage medium, which may comprise a non-volatile memory, for example in the form of a read-only-memory (ROM), a flash memory, a magnetic computer storage device (for example a hard disk) or an optical disk, which is pre-loaded with the computer-readable instructions.
  • an instruction store 74 may comprise writeable memory, such as random access memory (RAM) and the computer-readable instructions can be input thereto from a computer program product, such as a non-transitory computer-readable storage medium 75 (for example an optical disk such as a CD-ROM, DVD-ROM, etc.) or a computer-readable signal 76 carrying the computer-readable instructions.
  • the combination 77 of hardware components shown in Figure 10 and the computer-readable instructions are configured to implement the functionality of the control module 140.
  • operations caused when one or more processors 71 executes instructions stored in an instruction store 74 are described generally as operations performed by one of the input module 110, the visual device 120, the communication module 130, the control module 140, or by elements of the robot 20 or the robot module 30.
  • the information processing apparatus displays an XR content by means of an extended reality, XR, visual device, said XR content representing an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot.
  • the information processing apparatus causes the XR visual device to display said XR content.
  • the information processing apparatus receives a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
  • the information processing apparatus transmits a signal for causing said at least a part of said robot to move substantially along said new trajectory.
  • the robot 20 comprises one robotic arm with an end effector.
  • the robotic arm may comprise more than one end effector.
  • any suitable robot having at least one robotic arm may be used, such as robots comprising more than one robotic arm (each with one or more end effectors) and may comprise other elements.
  • the robot module 30 comprises a plurality of sensors 32. However, these may be omitted, for example if the robot module 30 does not provide a functionality of detecting objects in a vicinity of the robot 20.
  • the robot is placed on a robot module 30. However, this is non-limiting as the robot module 30 may be omitted, and the robot 20 may be affixed to any other suitable support.
  • the displayed XR content 40 is a mixed reality content using an environment captured by imaging means on the visual device 120.
  • the imaging means need not be located on the visual device 120 and may instead be at a predetermined location (e.g. on a wall, or mounted on a tripod) near the robot 20 and oriented towards the robot 20.
  • the imaging means may be omitted, for example if the visual device 120 comprises means such as transparent lenses for letting light from the environment through to the user, on which virtual elements may be projected to augment the environment viewed by the user.
  • the XR content 40 may instead be an augmented reality content or a virtual reality content.
  • the XR content 40 includes a number of virtual elements, for example the planned trajectory 41, the points 42 along the planned trajectory, the pointer 43, the virtual representation 50' of the object 50, the virtual representation 20' of the robot (or one or more parts of the robot).
  • a user input may be received to selectively hide any of these virtual elements from being displayed on the XR content 40, for example via the input module 110. Additionally, the user may select which part(s) of the robot should have a virtual representation displayed on the XR content 40.
  • the user input received via the input module 110 may activate or interrupt any animation being displayed in the XR content 40, such as an animation of the virtual representation 20' of the robot 20 (or one or more parts of the robot selected by the user).
  • the planned trajectory is for a movement of at least a part of the robot 20 that has not yet initiated, and which initiates when the input module 110 receives a second user input indicating that the movement should begin.
  • the communication module 130 may transmit the signal causing the movement of the at least a part of the robot substantially along the new trajectory without waiting for a second user input to be received (i.e. without receiving a confirmation from the user that the new trajectory is approved).
  • the XR device comprises one input module 110.
  • more than one input module may be provided instead.
  • for example, two input modules 110 may be provided, each held in one hand of the user 60.
  • a larger number of input modules may be provided, and the user may select which to operate at any given time.
  • the input module 110 transmits a first signal to the visual device 120, which in turn transmits a second signal to the communication module 130, or the input module 110 transmits the first signal to the control module 140.
  • the input module 110 may instead transmit the first signal (e.g. broadcast or multicast the first signal) to the communication module 130 and/or the control module 140 as well as the visual device 120, or instead of the visual device.
  • the visual device 120 receiving the first signal may merely adapt the XR content 40 being displayed. If the visual device does not receive the first signal, the communication module 130 or the control module 140 may transmit a separate signal to the visual device 120 indicating a required modification to the displayed XR content.
  • the input module 110 is used to manipulate a planned trajectory of at least a part of the robot 20.
  • the input module 110 may be operated (e.g. by pressing one or more buttons on the input module 110) to indicate a new desired operation of the robot 20 (i.e. other than manipulating a planned trajectory), such as instructing an end effector 24 having a suction cup to apply a vacuum to grip an object, to interrupt an applied vacuum to release an object, or to cause a 3D printing head to begin heating and extruding a filament, etc.
  • the first signal may be transmitted to the communication module 130 and/or the control module 140 directly, rather than to the visual device 120.
  • a visual alert is provided to the user via the XR content displayed by the visual device to indicate that the new trajectory desired by the user cannot be followed.
  • a sound and/or haptic alert indicating that the new trajectory cannot be followed may be used to notify the user, instead of or in addition to the visual alert.
  • the visual, sound and/or haptic alert may, instead of indicating an issue related to a user input, indicate a fault or other problem of the robot, for example that a component of the robot requires a repair or that a movement of the robot is blocked, and that the user should investigate.
  • the example aspects described here avoid limitations, specifically rooted in computer technology, relating to the control of robots having at least one robotic arm.
  • the user experience when controlling robots may be improved.
  • the example aspects described herein improve computers and computer processing/functionality, and also improve the field(s) of at least robot control and data processing.
  • Software embodiments of the examples presented herein may be provided as a computer program, or software, such as one or more programs having instructions or sequences of instructions, included or stored in an article of manufacture such as a machine-accessible or machine-readable medium, an instruction store, or computer-readable storage device, each of which can be non-transitory, in one example embodiment.
  • the program or instructions on the non-transitory machine-accessible medium, machine-readable medium, instruction store, or computer-readable storage device may be used to program a computer system or other electronic device.
  • the machine- or computer-readable medium, instruction store, and storage device may include, but are not limited to, floppy diskettes, optical disks, and magneto-optical disks or other types of media/machine-readable medium/instruction store/storage device suitable for storing or transmitting electronic instructions.
  • the techniques described herein are not limited to any particular software configuration. They may find applicability in any computing or processing environment.
  • the term computer-readable shall include any medium that is capable of storing, encoding, or transmitting instructions or a sequence of instructions for execution by the machine, computer, or computer processor and that causes the machine/computer/computer processor to perform any one of the methods described herein.
  • Such expressions are merely a shorthand way of stating that the execution of the software by a processing system causes the processor to perform an action to produce a result.
  • Some embodiments may also be implemented by the preparation of application-specific integrated circuits, field-programmable gate arrays, or by interconnecting an appropriate network of conventional component circuits.
  • Some embodiments include a computer program product.
  • the computer program product may be a storage medium or media, instruction store(s), or storage device(s), having instructions stored thereon or therein which can be used to control, or cause, a computer or computer processor to perform any of the procedures of the example embodiments described herein.
  • the storage medium/instruction store/storage device may include, by example and without limitation, an optical disc, a ROM, a RAM, an EPROM, an EEPROM, a DRAM, a VRAM, a flash memory, a flash card, a magnetic card, an optical card, nano systems, a molecular memory integrated circuit, a RAID, remote data storage/archive/warehousing, and/or any other type of device suitable for storing instructions and/or data.
  • some implementations include software for controlling both the hardware of the system and for enabling the system or microprocessor to interact with a human user or other mechanism utilizing the results of the example embodiments described herein.
  • Such software may include without limitation device drivers, operating systems, and user applications.
  • Such computer-readable media or storage device(s) further include software for performing example aspects of the invention, as described above.
  • a module includes software, although in other example embodiments herein, a module includes hardware, or a combination of hardware and software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

An extended reality, XR, device for a user controlling a robot comprising a robotic arm and an end effector, the XR device comprising: a visual device configured to display an extended reality, XR, content to said user, said XR content representing an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot; an input module configured to receive a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot; and a communication module configured to transmit a signal for causing said at least a part of said robot to move substantially along said new trajectory. A system, a method, and a computer program are also provided.

Description

EXTENDED REALITY INTERFACE FOR ROBOTIC ARM
Technical field
Example aspects herein relate to robotics, and in particular to an extended reality, XR, device for a user controlling a robot, a system, a method for controlling a robot, and a computer program.
Background
Robots, and in particular robots having a robotic arm, are used in various industries to process products. However, programming the movement of a robot requires advanced skills and knowledge, such as computer programming skills, computer process control and the calculation of advanced angles, positions etc. These requirements limit the potential use of robots.
It would therefore be advantageous to improve user experience when controlling robots. This is particularly useful for robots involved in industrial processes involving a collaboration between a human operator and a robot (in which case, the robot is also referred to as a collaborative robot, or cobot), where an operator involved in the industrial process may need to control the cobot.
Summary of the invention
According to a first example aspect herein, there is provided an extended reality, XR, device for a user controlling a robot comprising a robotic arm and an end effector.
Preferably the XR device comprises a visual device configured to display an extended reality, XR, content to said user. Preferably said XR content represents an environment in which at least one of said user and said robot is located and comprises a virtual element representing a planned trajectory for at least a part of said robot. Preferably the XR device comprises an input module configured to receive a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot. Preferably the XR device comprises a communication module configured to transmit a signal for causing said at least a part of said robot to move substantially along said new trajectory. The visual device may be any suitable device (e.g. a headset with an integrated display) to be worn or otherwise used by the user to visualise an XR content.
By way of non-limiting example, the XR content may be any content displayed to a user, including at least one 2D or 3D virtual element of digital information. The at least one virtual element may be superimposed onto a real environment, either captured or viewed live by the user, the at least one virtual element may be superimposed onto a virtual environment, or the at least one virtual element may represent a virtual environment. The XR content may be any of an augmented reality (AR) content, a mixed reality (MR) content, or a virtual reality (VR) content.
In cases where an AR content or an MR content is displayed, the displayed environment is a real environment, which may be captured by imaging means. The imaging means may be located on the visual device (to correspond to a point of view of the user using the visual device) or may be located near the robot and oriented towards the robot to capture images of the robot. Instead of the imaging means, the visual device may comprise transparent lenses allowing the user to view the real environment directly. In any case, the real environment may be shown to the user in real time (or with a negligible delay, such as a few milliseconds between the capture of an image and its display to the user).
The AR or MR content may be superimposed on an image of the environment (captured by the imaging means) or projected on the transparent lenses so that the user views the AR or MR content and the real environment.
In cases where a VR content is displayed, a virtual environment is displayed. The virtual environment may be generated from a computer model of a real environment that is drawn or measured (e.g. using a LIDAR or other point-cloud technology).
The input module may be any suitable device held or otherwise used by the user, which allows the user to interact with the XR content. Specifically, a virtual pointer corresponding to the input module is displayed in the XR content. As the user moves the input module, the movement of the input module is captured (e.g. using sensors such as accelerometers in the input module and/or a system placed around the user to monitor movements of the input module), and the position of the virtual pointer moves according to the movement of the input module. The term manipulating should be understood to mean any interaction by the user with the virtual element representing the trajectory of at least a part of the robot, using any suitable input mechanism.
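By way of non-limiting illustration, the following Python sketch shows one possible mapping from a sensed displacement of the input module to the position of the virtual pointer; the type, function name and scaling factor are hypothetical assumptions, not features taken from the disclosure.

```python
# Non-limiting sketch: updating the virtual pointer from a sensed
# displacement of the input module. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def update_pointer(pointer: Vec3, sensed_delta: Vec3, scale: float = 1.0) -> Vec3:
    """Move the virtual pointer by the displacement sensed for the input module."""
    return Vec3(pointer.x + scale * sensed_delta.x,
                pointer.y + scale * sensed_delta.y,
                pointer.z + scale * sensed_delta.z)
```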
The planned trajectory indicates a pre-programmed trajectory that the robot would follow when operating unless the user modifies the trajectory. The planned trajectory may be a trajectory of an ongoing movement of the robot or of a movement that has not yet been initiated.
This trajectory may be for a specific part of the robot (e.g. a movement of a segment of the robotic arm, a rotation of a joint of the robotic arm, a movement, a rotation, or an operation of the end effector), or it may be a trajectory for multiple parts, or substantially the entire robotic arm (or entire robot).
By "substantially along the new trajectory", it is intended to mean that the trajectory taken by the at least a part of the robot may coincide with the trajectory defined by the user input as closely as possible, or it may involve some modifications to the trajectory defined by the user input, as will be explained in more detail below.
The communication module may transmit the signal via any suitable communication link, either directly to the robot or to another device that controls the movement of the robot.
In other words, the XR device displaying the XR content provides an XR interface for the user controlling the robot.
Thus, by allowing the user to visualise the planned trajectory of at least a part of the robot and allowing the user to manipulate the virtual element representing the planned trajectory, the user can control the robot in a convenient way. Thus, the XR device (and the XR interface provided by the XR device) improves user experience when controlling robots.
Preferably, said planned trajectory is a planned trajectory of at least one of said robotic arm and said end effector.
Preferably, said XR content further comprises a virtual representation of said at least a part of said robot, more preferably of said robot. Preferably said visual device is configured to display a movement of said virtual representation in accordance with at least one of said planned trajectory and said new trajectory.
Accordingly, the user may visualize how the robot would move (according to the planned trajectory or the new trajectory) before the robot begins moving, thus improving the user's experience.
Preferably, said input module is configured to receive a user input indicating a selection between two or more of: said XR content displaying a virtual representation of said robot entirely, said XR content displaying a virtual representation of one or more parts of said robot selected by said user, and said XR content not displaying any virtual representation of said robot or part of said robot.
Preferably, said input module is configured to receive a user input indicating a selection between said XR content displaying an animation of a virtual representation of at least a part of said robot, and said XR content displaying a still virtual representation of at least a part of said robot.
Preferably, said input module is configured to receive a second user input for triggering said communication module to transmit said signal.
Accordingly, the user may select a time when the at least a part of the robot should move in accordance with the defined new trajectory, thus allowing the user to modify the trajectory as desired.
Preferably, said input module is configured to receive said user input as the robot is moving along the planned trajectory, and said communication module is configured to transmit a signal that causes said at least a part of said robot to modify an ongoing movement.
Accordingly, the user may control ongoing operations of the robot in real-time, thus allowing the user to correct potential issues with the planned trajectory.
Preferably, said XR device further comprises a control module configured to cooperate with said input module to define said new trajectory in accordance with a set of predetermined constraints defined for at least one of said environment and said robot.
Accordingly, the trajectory defined by the user may be modified to meet the predetermined constraints, for example to avoid collision between the robot and elements of the environment, to avoid damage to the robot or to an object handled by the robot, etc. Accordingly, the user does not need to follow the predetermined constraints when defining a new trajectory (or check whether the new trajectory satisfies all predetermined constraints) as the control module makes this determination, which therefore improves user experience when defining the desired trajectory, whilst ensuring predetermined constraints are satisfied.
Preferably, said set of predetermined constraints comprises at least one of a physical constraint and an operational constraint.
The set may comprise any number (e.g. more than one) of physical constraints and any number of operational constraints. The physical constraint(s) may depend on an operational limit of the robot, or the physical constraint(s) may depend on the physical environment of the robot.
Physical constraint(s) depending on an operational limit of the robot may include, for example, a limit to possible configurations of the robot (e.g. a limit to rotations or extension of parts of the robot). Physical constraint(s) depending on the physical environment of the robot may include one or more spaces (two or three dimensional) around the robot that the robot cannot enter (for example zones where obstacles are present or are likely to be present).
The operational constraint(s) depend on an operation that the robot is to perform, and may depend for example on an object to be processed by the end effector, such as a preferred orientation of the object when moved by the robotic arm (e.g. to prevent spillage or damage to the object), a restriction in movement in a particular direction when the object is held by the robotic arm, etc.
Accordingly, constraint(s) may be predetermined to ensure the operation of the robot does not cause damage to the object being processed, to the robot, or to the environment of the robot.
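Purely by way of illustration, such constraints might be represented as simple data structures that a candidate trajectory can be checked against, as in the following sketch; all class and field names are hypothetical, and the check shown covers only the keep-out-zone case.

```python
# Non-limiting sketch of predetermined constraints. All class and field
# names are hypothetical illustrations, not part of the disclosure.
from dataclasses import dataclass

Point = tuple[float, float, float]

@dataclass
class JointLimit:
    """Physical constraint depending on an operational limit of the robot."""
    joint_name: str
    min_angle_deg: float
    max_angle_deg: float

@dataclass
class KeepOutZone:
    """Physical constraint: an axis-aligned box the robot must not enter."""
    lower: Point
    upper: Point

    def violated_by(self, p: Point) -> bool:
        # True when the point lies inside the forbidden box.
        return all(lo <= v <= hi for v, lo, hi in zip(p, self.lower, self.upper))

def trajectory_allowed(waypoints: list[Point], zones: list[KeepOutZone]) -> bool:
    """True if no waypoint of the candidate trajectory enters a keep-out zone."""
    return not any(z.violated_by(p) for z in zones for p in waypoints)
```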
Preferably, said control module is configured to cooperate with said input module to modify said new trajectory, to reduce at least one of: an amount of energy required for said at least a part of said robot to reach an end point of said new trajectory, a risk of collision of at least one of said robot and an object manipulated by said robot with said environment, and an effect that a movement of said at least a part of said robot substantially along said new trajectory would have on an object manipulated by said robot. Accordingly, the new trajectory that is to be followed by the robot can be optimized, in terms of energy resources and/or risk reduction.
Preferably, said communication module is configured to communicate with a robot module on which said robot is placed. Preferably said robot module comprises a plurality of sensors, preferably configured to sense a space in a vicinity of said robot and said robot module for the presence of any object.
Accordingly, the XR device and the module may exchange information related to the current environment of the robot placed on the robot module. For example, the XR device may provide the robot module with information on the new trajectory of the robot, and the robot module may use this information to adapt the sensing of the space in the vicinity of the robot and the module.
Preferably, said communication module is configured to obtain data related to at least one of said robot and said space from said robot module.
The data obtained from the robot module may indicate a fault occurring in the robot or in the robot module, a sensed object to be represented in the XR content, or a potential collision with the sensed object.
Accordingly, the data obtained from the robot module may be used to present suitable information to the user as part of the XR content, and/or to modify the trajectory in accordance with the current environment of the robot.
Preferably, said visual device is configured to modify said virtual element in accordance with said received user input substantially contemporaneously with said user manipulating said virtual element.
By "substantially contemporaneously with", it is intended to mean that the modification is displayed simultaneously or with a negligible delay (e.g. a few milliseconds).
Accordingly, the manipulation performed by the user may be instantaneously shown to the user, thus allowing the user to better define the new trajectory.
Preferably, said virtual element comprises at least one of a curve and a number of points in a virtual three dimensional space.
For example the curve may be a line or a plurality of segments that may be linked.
The virtual element may also comprise a plurality of curves separate from each other (e.g. showing only parts of the trajectory) or a number (one or more) of points that should be reached by the part of the robot in a given order.
Accordingly, the user may easily visualize the trajectory and identify all or a set of points that the part of the robot should travel to.
Preferably, said user input modifies said virtual element by dragging, or by moving a part of said virtual element in said environment represented by said XR content.
The part may be a point or a section of a curve representing the trajectory.
The user may therefore selectively modify a part of the trajectory, whilst the rest of the trajectory is either substantially unchanged or is adapted to be in conformity with the modified part. Thus, the user may easily modify only a part of the trajectory.
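As a non-limiting illustration of such selective modification, the following sketch drags a single waypoint of a trajectory represented as a list of 3D points, leaving the remaining waypoints unchanged; the names are hypothetical.

```python
# Non-limiting sketch: dragging one waypoint of a trajectory represented
# as a list of 3D points; the remaining waypoints are left unchanged.
Point = tuple[float, float, float]

def drag_waypoint(trajectory: list[Point], index: int, new_position: Point) -> list[Point]:
    """Return a copy of the trajectory with the waypoint at `index` moved."""
    edited = list(trajectory)          # leave the original trajectory intact
    edited[index] = new_position
    return edited

# Usage: move the second waypoint of a three-point trajectory upwards.
planned = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.1), (1.0, 0.0, 0.0)]
new_trajectory = drag_waypoint(planned, 1, (0.5, 0.2, 0.4))
```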
Preferably, said XR device comprises means to alert said user of an issue related to at least one of said new trajectory and said robot.
For example, the issue that is alerted may indicate that the new trajectory should be rectified, for example if it fails to comply with a predetermined constraint, or the issue may indicate a technical issue with the robot that should be attended to by the user, for example to repair, replace or maintain the robot (or the robot module).
Preferably, said means to alert comprise at least one of audio, visual and haptic means.
For example, the visual device may be coupled to loudspeakers, earphones, etc. to alert the user. The visual alert may be provided as part of the XR content, or (in case the XR content is an AR or MR content) it may be a visual alert placed in a location that is likely to be in the field of view of the user (for example on the input module), so that the user viewing the XR content is likely to view the visual alert. The haptic means may be provided via the input means or any other part of the XR device in contact with the user (e.g. the headset worn by the user), and may generate a vibration to alert the user.
According to a second example aspect herein, there is provided a system comprising the XR device according to the first example aspect disclosed herein, and said robot.
Preferably, the system comprises said robot module.
According to a third example aspect herein, there is provided a method for a user controlling a robot comprising a robotic arm and an end effector. Preferably the method comprises displaying an extended reality, XR, content by means of a visual device, said XR content preferably representing an environment in which at least one of said user and said robot is located, and preferably comprising a virtual element representing a planned trajectory for at least a part of said robot.
Preferably the method comprises receiving a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
Preferably the method comprises transmitting a signal for causing said at least a part of said robot to move substantially along said new trajectory.
Preferably, said planned trajectory is a planned trajectory of at least one of said robotic arm and said end effector.
Preferably, said XR content further comprises a virtual representation of said at least a part of said robot. Preferably the method further includes displaying a movement of said virtual representation in accordance with at least one of said planned trajectory and said new trajectory.
Preferably, the method further comprises modifying said new trajectory in accordance with a set of predetermined constraints defined for at least one of said environment and said robot.
Preferably, said set of predetermined constraints comprises at least one of a physical constraint and an operational constraint.
Preferably, said modifying said new trajectory includes reducing at least one of: an amount of energy required for said at least a part of said robot to reach an end point of said new trajectory, a risk of collision of at least one of said robot and an object manipulated by said robot with said environment, and an effect that a movement of said at least a part of said robot along said new trajectory would have on an object manipulated by said robot.
According to a fourth example aspect herein, there is provided a computer program comprising instructions which, when executed by one or more processors, cause said one or more processors to carry out the method according to the third example aspect disclosed herein.
According to a fifth example aspect herein, there is provided a method for a user controlling a robot comprising a robotic arm and an end effector, the method comprising: causing a visual device to display an extended reality, XR, content to said user, said XR content representing an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot; receiving a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot; and transmitting a signal for causing said at least a part of said robot to move substantially along said new trajectory.
According to a sixth example aspect herein, there is provided a computer program comprising instructions which, when executed by one or more processors, cause said one or more processors to carry out the method according to the fifth example aspect disclosed herein.
Brief Description of the Drawings
Embodiments of the present invention, which are presented for better understanding the inventive concepts, but which are not to be seen as limiting the invention, will now be described with reference to the figures in which:
Figure 1 shows a schematic view illustrating an example of a system in example embodiments;
Figure 2 shows a schematic diagram illustrating an example of an XR device in example embodiments;
Figure 3 shows a schematic view illustrating an example of an XR device and an XR content displayed to a user, in example embodiments;
Figure 4 shows a schematic diagram illustrating an example of an XR device in example embodiments;
Figures 5 to 8 show schematic views illustrating examples of XR content displayed to a user, in example embodiments;
Figures 9a and 9b show schematic views illustrating an example of a robot module coupled to a robot, in example embodiments;
Figure 10 shows a schematic diagram illustrating an example of a general kind of programmable processing apparatus that may be used to implement a control module in example embodiments;
Figure 11 shows processing operations performed in example embodiments.
Detailed Description
Although example embodiments will be described below, it will be evident that various modifications may be made to these example embodiments without departing from the broader spirit and scope of the present disclosure. Accordingly, the following description and the accompanying drawings are to be regarded as illustrative rather than restrictive.
In the following description and in the accompanying figures, numerous details are set forth in order to provide an understanding of various example embodiments. However, it will be evident to those skilled in the art that embodiments may be practiced without these details.
Various elements described herein may communicate with each other using any number of suitable communication links. Each communication link may be, for example, a wireless communication link (for example a Wi-Fi, cellular telephone data link such as LTE/5G, Bluetooth or Bluetooth Low Energy (BLE) link) or a wired communication link (e.g. DSL, fibre-optic cable, Ethernet, etc.). Each communication link need not be permanent. For brevity, the following description will simply refer to elements transmitting or receiving signals, information, or data, which should be understood to be via a communication link established between these elements (either directly or via one or more intermediary elements acting as a relay).
Referring to Figure 1, an example of a system in example embodiments will now be described.
As shown on Figure 1, the system comprises an XR device 10, a robot 20 and a robot module 30.
The XR device 10 comprises a visual device 120 (shown on the example of Figure 1 in the form of a wearable headset), and an input module 110 (shown on the example of Figure 1 in the form of a remote controller with a button). The user 60 holds the input module 110, and views an XR content through the visual device 120.
By way of non-limiting example, a communication module 130 may be co-located with the visual device 120 (e.g. in the wearable headset as shown on Figure 1).
The robot 20 comprises at least a robotic arm 22 having a number of segments, and an end effector 24.
By way of non-limiting examples, the end effector 24 may be used for a range of processes, where the process may be at least one of picking and placing objects (e.g. pick-and-place operations), assembling products, additive manufacturing (e.g. 3D printing), and visual detecting (e.g. detecting a set of points in an environment using imaging/sensing means and using the detected set of points to identify/optimise a trajectory of the end effector in the environment, or to determine how the end effector should contact an object corresponding to the set of points).
The robot module 30 is communicatively coupled to the robot 20 and comprises a plurality of sensors. The robot module 30 provides one or more functions such as detecting objects in a space in a vicinity of the robot 20 and controlling the robot 20 to avoid or minimise an effect of a collision with a detected object, or facilitating communication between the robot 20 and the XR device 10 by converting data to/from the robot into a data format that is compatible with the XR device 10.
The top part of Figure 1 shows an example of an XR content 40 that may be displayed to the user.
By way of non-limiting example, the XR content is a mixed reality (MR) content, which is captured using stereoscopic imaging means provided on the visual device 120 and oriented so as to coincide with the view of the user 60. The orientation of the visual device 120 is monitored using a monitoring system (not shown on Figure 1) detecting changes in movement of the visual device 120 due to movements of the head of the user 60.
The MR content includes one or more real elements being displayed, such as the robot 20 and the robot module 30, and a number of virtual elements being displayed, such as a planned trajectory 41 of the end effector 24, a number of points 42 along the planned trajectory, and a pointer 43 corresponding to the input module 110, the user 60 being able to control the position of the pointer 43 within the MR content by moving the input module 110.
Although Figure 1 shows a virtual element for a planned trajectory of a single part of the robot, this is purely for simplicity. Instead, the XR content 40 may comprise virtual elements (e.g. curves or points) representing a planned trajectory for a plurality of parts of the robot, for example a trajectory for each rotatable joint of the robotic arm 22. Each of these virtual elements may be manipulable by the user.
Referring now to Figure 2, an example of an XR device in example embodiments will now be described.
In the example shown on Figure 2, the XR device 10 comprises an input module 110, a visual device 120, and a communication module 130. As explained above, each of these elements of the XR device 10 may be communicatively coupled to each other using any suitable communication link.
The input module 110 is controlled by the user as an input mechanism to provide instructions to the XR device 10.
The input module 110 comprises, or is coupled with, means for detecting a movement of the input module 110 that is imparted by the user holding the input module 110. For example, when the user moves the input module 110, this movement is detected (e.g. using sensors in the input module 110 or a separate monitoring system monitoring the position and orientation of the input module 110).
In addition, the input module 110 comprises a number of input means such as buttons, sliders, triggers, etc. Each input means has a predetermined (or user-configurable) associated operation having an effect on the XR content. For example, one of the input means (e.g. a button) may indicate a user selection, a second input means (e.g. a slider) may indicate a desired adjustment to a zoom level of the XR content being displayed, a third input means may indicate a desired switch between two different display modes (e.g. with different levels of information being displayed), etc.
When a user operation (e.g. a movement and/or an operation of one or more of the input means) is detected, a first signal indicating this user operation is transmitted to the visual device 120.
The visual device 120 displays an XR content to the user, using a display or by projecting virtual elements of XR content on optical elements that are located in the user's view. In particular, the visual device 120 displays a virtual pointer corresponding to the input module 110, and a virtual element representing a planned trajectory of at least a part of the robot 20 (e.g. of a part of the robotic arm, or of the end effector). Upon receiving a first signal, the visual device 120 determines whether the user operation indicated by the first signal requires a modification to the displayed XR content.
If so, the visual device 120 modifies the XR content in accordance with the user operation. For example, if the user operation indicates a movement of the input module 110, the visual device 120 causes a virtual pointer corresponding to the input module 110 to move in accordance with the movement of the input module 110. Additionally, if the first signal indicates that the user operated one of the input means, the visual device 120 can modify the displayed XR content in accordance with any user operation corresponding to the operated input means, for example by adjusting a zoom level, toggling between different modes of displaying information, etc.
The visual device 120 also determines whether the first signal indicates that a user manipulation (e.g. a click-and-drag operation) was performed on the virtual element representing the planned trajectory of the robot.
If so, the visual device 120 modifies the virtual element in accordance with the user manipulation, and transmits a second signal to the communication module 130 indicating the user manipulation and corresponding change to the virtual element.
The communication module 130 comprises means for communicating with at least the robot 20. The communication module 130 may also communicate with an external device.
Upon receiving the second signal, the communication module 130 generates a third signal for causing the robot 20 to perform the desired change to the operation in accordance with the user manipulation. This third signal may be transmitted to the robot 20 directly, or it may be transmitted to an external device, separate from the robot 20 and the XR device 10, which in turn controls the robot 20.
It would be understood that signals between the input module 110, the visual device 120 and the communication module 130, or signals transmitted from the communication module 130 to cause an operation of the robot 20 may be transmitted recurringly.
By way of non-limiting example, the first signal may be generated recurringly, for example every millisecond, at least when a movement of the input module 110 or a user operation on an input means is detected, such that the visual device 120 receives the first signal and modifies the XR content as the user moves the input module 110 or operates the input means. Accordingly, the user may receive visual feedback of the current operation on the input module 110. Preferably, a delay between the user operation and the corresponding modification to the XR content is below or near a human perception threshold, such that the user feels the XR content changes simultaneously with the operation on the input module 110.
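By way of non-limiting illustration, the following Python sketch shows one way the recurring emission of the first signal might be organised; `read_motion`, `send_first_signal` and the polling period are hypothetical assumptions, not features taken from the disclosure.

```python
# Non-limiting sketch: emitting the first signal recurringly (e.g. every
# millisecond) while user operations on the input module are detected.
# `read_motion` and `send_first_signal` are hypothetical callables.
import time

def input_module_loop(read_motion, send_first_signal, period_s: float = 0.001):
    while True:
        delta = read_motion()            # displacement sensed since last tick
        if delta is not None:            # only signal when an operation occurs
            send_first_signal(delta)     # visual device then updates the XR content
        time.sleep(period_s)             # keep latency below perception threshold
```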
Referring now to Figure 3, an example of an XR device and an XR content displayed to a user will now be described.
The visual device 120 displays an XR content 40 to the user. The XR content 40 comprises real elements including the robot 20, an object 50 to be processed by the robot 20 and a working surface 51 on which the object 50 is to be moved. The XR content 40 also includes a number of virtual elements, including a planned trajectory 41 of the robot 20 (in this case, the planned trajectory of the end effector) when the robot 20 picks up the object 50 and moves it from an initial position to a final position (as indicated by element 50', which may be a virtual representation of the object 50).
However, it would be understood that any number of real elements or virtual elements may be displayed instead, and real elements may be omitted (for example in the case of a virtual content being displayed).
The XR content 40 also includes a virtual pointer 43 which moves as the user moves the input module 110.
As shown on Figure 3, when the user points the virtual pointer 43 to a point 42 on the planned trajectory, and drags point 42 (e.g. by clicking on a button on the input module 110 and changing the orientation of the input module 110), the planned trajectory 41 of the robot 20 is manipulated and a new trajectory 44 is defined.
Accordingly, at a desired subsequent time (for example when the user releases the button or when the user presses another button on the input means to indicate the manipulation is terminated), the robot 20 receives a signal instructing the object 50 to be picked-up and moved along trajectory 44 to the final position 50'.
Referring now to Figure 4, an example of an XR device in example embodiments will now be described. For brevity, elements of the XR device 11 that have already been described in connection with Figure 2 above (and identified with the same reference numbers in the Figures) will not be referred to again here.
The XR device 11 comprises a control module 140. The control module 140 transmits a signal to the visual device 120 indicating the XR content to be displayed to the user.
In the example of Figure 4, the input module 110 transmits the first signal to the control module 140, instead of the visual device 120.
Upon receiving the first signal, the control module 140 determines whether the first signal indicates that a user manipulation (e.g. a click-and-drag operation) was performed on the virtual element representing the planned trajectory of the robot.
If the first signal does not indicate a user manipulation on the virtual element, the control module 140 determines a modification required to the displayed XR content and transmits a signal to the visual device 120 to cause the visual device 120 to display the modified XR content.
If the first signal does indicate a user manipulation on the virtual element, the control module 140 determines whether the new trajectory defined by the user manipulation satisfies a number of predetermined operational and physical constraints that are defined for the robot 20 and for the environment.
If the new trajectory satisfies the predetermined constraints, the control module 140 determines a modification required to the displayed XR content and transmits a signal to the visual device 120 to cause the visual device to display the modified XR content. Then, upon receiving another signal from the input module 110 indicating the user confirmed the displayed new trajectory (e.g. a second user input), the control module 140 transmits another signal to the communication module 130 to cause the communication module 130 to transmit the signal for causing the robot 20 to move along the new trajectory.
If on the other hand the new trajectory does not satisfy one or more of the predetermined constraints, the control module 140 either modifies the new trajectory to comply with the predetermined constraints or alerts the user that the robot 20 cannot proceed along the new trajectory, thus allowing the user to re-define a new trajectory. In other words, the control module 140 and the input module 110 cooperate to define a new trajectory in accordance with the predetermined constraints.
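Purely as an illustration of this cooperation, the sketch below outlines the decision flow under the assumption that each constraint exposes a hypothetical `is_satisfied` check; all helper names are invented for the example and are not part of the disclosure.

```python
# Non-limiting sketch of the control-module flow described above: validate
# the user-defined trajectory, then display it, repair it, or alert the
# user. All helper callables are hypothetical.
def handle_user_trajectory(trajectory, constraints, display, alert, try_repair):
    violations = [c for c in constraints if not c.is_satisfied(trajectory)]
    if not violations:
        display(trajectory)              # await a second user input to confirm
        return trajectory
    repaired = try_repair(trajectory, violations)
    if repaired is not None:
        display(repaired)                # new trajectory modified to comply
        return repaired
    alert(violations)                    # user must re-define the trajectory
    return None
```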
As explained in connection with Figure 1, the control module 140 may cause the visual device 120 to display an XR content comprising virtual elements (e.g. curves or points) representing a planned trajectory for a plurality of parts of the robot, for example a trajectory for each rotatable joint of the robotic arm 22. Each of these virtual elements may be manipulable by the user.
In such cases, a corresponding set of constraints may be predetermined for each part of the robot. For example, each joint may be rotatable by a different amount, or each extendable segment of the robotic arm may have a different load capacity.
Upon receiving a signal indicating a user manipulation on a virtual element representing one of the planned trajectories, the control module 140 may determine whether the indicated user manipulation satisfies the set of constraints predetermined for the corresponding part of the robot.
Referring now to Figure 5, an example of an XR content displayed to the user will be described.
The XR content 40 shown on Figure 5 includes the robot 20, and a virtual representation of the robot 20' (e.g. a virtual twin of the robot). In addition, the XR content 40 comprises a virtual element 41 representing a planned trajectory of the robot.
However, it would be understood that a virtual element representing the new trajectory defined by the user manipulation may be displayed in the XR content 40, instead of or in addition to the virtual element 41 representing the planned trajectory.
The XR content 40 shows an animation of the virtual representation 20' of the robot 20 as it moves along the trajectory 41, thus allowing the user to visualize how the robot 20 would move when instructed.
However, it would be understood that, instead of a virtual representation 20' of the robot, the XR content 40 may show a virtual representation of only a part of the robot 20 (e.g. the part corresponding to the trajectory 41), which may be animated as the represented part moves along the trajectory 41.
For example, the user may select between the XR content displaying a virtual representation of the entire robot 20, a virtual representation of one or more parts of the robot 20 selected by the user (e.g. only the end effector 24, or the end effector 24 and a rotatable joint connected to the end effector 24), or not viewing any virtual representation of the robot 20. The user may also select between any virtual representation of a (part of) the robot 20 being animated or not. User selections may be performed, for example, via input means provided on the input module 110.
Referring now to Figure 6, an example of an XR content displayed to the user will be described.
The XR content 40 shown on Figure 6 includes the robot 20, and a virtual element 41 representing the planned trajectory of the robot.
In addition, the XR content includes a new trajectory 44 ending at a new final position 46, that the user defined by drawing the new desired trajectory 44 (i.e. not by dragging a part of the virtual element 41).
However, in this case, the final position 46 is beyond the operational reach of the robot 20 (in other words, it is out of reach of the robot 20). The control module 140 therefore determines that the new trajectory does not comply with a predetermined physical constraint.
As a result, the control module 140 causes the visual device 120 to display a visual alert 47 which notifies the user of the issue.
Referring now to Figure 7, an example of an XR content displayed to the user will be described.
The XR content 40 shown on Figure 7 includes the robot 20, and a virtual element 44 representing the new trajectory of the robot 20 defined by user manipulation, ending at a final position 46.
In the present case, the control module 140 determines that the new trajectory represented by the virtual element 44 does not comply with an operational constraint, namely that an amount of energy required for the robot 20 to reach the final position 46 is not minimized (in other words, that the trajectory can be optimized in terms of energy requirements).
Accordingly, the control module 140 modifies the new trajectory to minimize the required energy, and causes the visual device 120 to include in the XR content 40 a virtual element 48 representing the new trajectory modified to minimize the required energy. The control module 140 may then (either when the trajectory represented by the virtual element 48 is approved by the user, or automatically) transmit a signal to the communication module 130, which in turn transmits a signal for causing the robot 20 to move along the trajectory represented by the virtual element 48. Because the modified new trajectory (represented by the virtual element 48) has the same end position 46 as the new trajectory defined by the user (represented by the virtual element 44), a movement of the robot 20 along one of these trajectories is considered to be substantially along the other of the trajectories.
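By way of non-limiting illustration, one simple way to shorten a polyline trajectory while preserving its end points is iterative waypoint smoothing, as in the following sketch; using path length as a proxy for the energy requirement is an assumption made for the example only.

```python
# Non-limiting sketch: smoothing the intermediate waypoints of a trajectory
# while keeping both end points fixed, so that the path length (used here
# as a crude proxy for the energy requirement) is reduced. Illustrative only.
def smooth_trajectory(waypoints, iterations=10, alpha=0.5):
    pts = [list(p) for p in waypoints]
    for _ in range(iterations):
        for i in range(1, len(pts) - 1):          # end points stay fixed
            for k in range(3):
                midpoint = 0.5 * (pts[i - 1][k] + pts[i + 1][k])
                pts[i][k] += alpha * (midpoint - pts[i][k])
    return [tuple(p) for p in pts]
```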
Referring now to Figure 8, an example of an XR content displayed to the user will be described.
The XR content 40 shown on Figure 8 includes the robot 20, an object 50 to be picked-up by the robot 20, and an object 52 present in the environment of the robot.
Additionally, the XR content 40 comprises a virtual element 44 representing the new trajectory of the robot 20 defined by user manipulation.
However, in the present case, the trajectory represented by the virtual element 44 would cause the object 50 to collide with object 52. Thus, the control module 140 determines that the new trajectory represented by the virtual element 44 does not comply with a predetermined constraint, namely that a collision with the environment (and specifically objects in the environment) should be avoided.
Accordingly, the control module 140 modifies the new trajectory to avoid the collision, and causes the visual device 120 to include in the XR content 40 a virtual element 48 representing the new trajectory modified to avoid a collision.
As explained in connection with Figure 7 above, the robot 20 may then move the object 50 according to the new modified trajectory represented by the virtual element 48.
Referring now to Figures 9a and 9b, an example of a robot and a robot module that may be coupled to the robot will be described.
In the example shown on Figure 9a, the robot 20 includes a robotic arm 22, an end effector 24 and a base 26. The robot module 30 comprises a plurality of sensors 32 and an interface unit 34. With the sensors 32, the robot module 30 may detect objects in a space in a vicinity of the robot 20 and control the robot 20 to avoid or minimise an effect of a collision with a detected object.
With the interface unit 34, the robot module 30 may facilitate communication between the robot 20 and the XR device 10 by converting data to/from the robot into a data format that is compatible with the XR device 10.
The interface unit 34 is located on a top surface of the robot module 30 and is mechanically coupled to the robot 20 via the base 26 of the robot, using for example fastening means such as bolts and threaded through-holes.
By way of non-limiting example, as shown on Figure 9a, the top surface of the robot module 30 has a substantially octagonal shape, and a bottom surface of the robot module 30 has a substantially square shape. The robot module 30 has eight side surfaces (four triangular and four trapezoidal) joining the top surface and the bottom surface. Between the top surface and the bottom surface (defining planes P1 and P2, respectively), the robot module 30 defines a taper which is narrower at the top surface than at the bottom surface.
Each of these sides includes a respective sensor 32, although Figure 9a only shows four of these sensors.
Accordingly, each sensor 32 is facing a respective direction away from the robot module 30 (and the robot 20), and can therefore detect a space around the robot module 30 and the robot 20 for any object that should not collide with the robot 20.
Additionally, the robot module 30 has a tapered outer shape, which is narrower towards the top. In other words, a cross-section along a horizontal plane of the robot module 30 has a smaller area towards the top of the robot module 30 than towards the bottom of the robot module 30. Accordingly, more components of the robot module 30 may be placed towards the bottom (i.e. towards plane P2), thus lowering the center of gravity of the robot module 30, which in turn improves its stability.
Each sensor 32 may comprise one or more cameras (e.g. IR, RGB/visible or multispectral cameras), infrared sensors, ultrasonic sensors, LIDARs or other suitable sensors. Preferably, the sensors 32 comprise a plurality of RGB cameras. Preferably, each sensor 32 comprises at least an RGB camera (i.e. a camera capturing visible light), and processing means for identifying object(s) on a captured image and for estimating a distance from the sensor 32 to each identified object. Based on this distance, the sensor 32 may determine whether each object is within a predetermined space in a vicinity of the robot and the module.
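As a non-limiting illustration of this final determination, the sketch below compares the estimated distance with a radius defining the predetermined space; the radius value is an arbitrary assumption, not a value taken from the disclosure.

```python
# Non-limiting sketch: deciding whether an object identified on a captured
# image lies within the predetermined space in the vicinity of the robot.
# The 1.5 m radius is an arbitrary illustrative value.
def object_in_vicinity(estimated_distance_m: float,
                       vicinity_radius_m: float = 1.5) -> bool:
    return estimated_distance_m <= vicinity_radius_m
```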
Referring now to Figure 9b, an example of the space in the vicinity of the robot 20 and the robot module 30 will now be described.
Figure 9b shows a top view of the robot 20 and the robot module 30. For illustrative purpose, only a part of the robot 20 is shown on Figure 9b.
By orienting each sensor 32 in a different direction, a space 36 in the vicinity of the robot 20 and the robot module 30 may be sensed for objects.
Referring now to Figure 10, an example of a general kind of programmable processing apparatus 70 that may be used in various components of the XR device 10 or 11 described herein, such as the input module 110, the visual device 120, the communication module 130, or the control module 140, is shown.
The programmable processing apparatus 70 comprises one or more processors 71, one or more input/output communication modules 72, one or more working memories 73, and one or more instruction stores 74 storing computer-readable instructions which can be executed by one or more processors 71 to perform the processing operations as described hereinafter.
An instruction store 74 is a non-transitory storage medium, which may comprise a non-volatile memory, for example in the form of a read-only memory (ROM), a flash memory, a magnetic computer storage device (for example a hard disk) or an optical disk, which is pre-loaded with the computer-readable instructions. Alternatively, an instruction store 74 may comprise writeable memory, such as random access memory (RAM), and the computer-readable instructions can be input thereto from a computer program product, such as a non-transitory computer-readable storage medium 75 (for example an optical disk such as a CD-ROM, DVD-ROM, etc.) or a computer-readable signal 76 carrying the computer-readable instructions. The combination 77 of hardware components shown in Figure 10 and the computer-readable instructions is configured to implement the functionality of the control module 140.
In the description herein, operations caused when one or more processors 71 executes instructions stored in an instruction store 74 are described generally as operations performed by one of the input module 110, the visual device 120, the communication module 130, the control module 140, or by elements of the robot 20 or the robot module 30.
In summary, it will be appreciated from the description above that certain example embodiments perform processing operations to effect a method as shown in Figure 11 that is performed by an information processing apparatus (e.g. the XR device 10 or 11).
Referring to Figure 11, at step S82, the information processing apparatus displays an XR content by means of an extended reality, XR, visual device, said XR content representing an environment in which at least one of said user and said robot is located and comprising a virtual element representing a planned trajectory for at least a part of said robot. In other words, the information processing apparatus causes the XR visual device to display said XR content.
At step S84, the information processing apparatus receives a user input manipulating said virtual element to define a new trajectory for said at least a part of said robot.
At step S86, the information processing apparatus transmits a signal for causing said at least a part of said robot to move substantially along said new trajectory.
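By way of non-limiting illustration, the three steps of Figure 11 may be summarised as the following Python sketch, in which the three callables are hypothetical stand-ins for the visual device, input module and communication module described above.

```python
# Non-limiting sketch of the method of Figure 11; the three callables are
# hypothetical stand-ins for the modules described in this disclosure.
def control_robot(display_xr_content, receive_manipulation, transmit_signal):
    display_xr_content()                     # step S82: show environment + trajectory
    new_trajectory = receive_manipulation()  # step S84: user manipulates the element
    transmit_signal(new_trajectory)          # step S86: cause the robot to move
```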
Modifications and variations
Many modifications and variations can be made to the example embodiments described above.
In examples described above, the robot 20 comprises one robotic arm with one end effector. However, this is non-limiting, as the robotic arm may comprise more than one end effector. More generally, any suitable robot having at least one robotic arm may be used, such as robots comprising more than one robotic arm (each with one or more end effectors), and such robots may comprise other elements.
In examples described above, the robot module 30 comprises a plurality of sensors 32. However, these may be omitted, for example if the robot module 30 does not provide a functionality of detecting objects in a vicinity of the robot 20. In examples described above, the robot is placed on a robot module 30. However, this is non-limiting as the robot module 30 may be omitted, and the robot 20 may be affixed to any other suitable support.
In examples described above, the displayed XR content 40 is a mixed reality content using an environment captured by imaging means on the visual device 120. However, it would be understood that the imaging means need not be located on the visual device 120 and may instead be at a predetermined location (e.g. on a wall, or mounted on a tripod) near the robot 20 and oriented towards the robot 20. Additionally, the imaging means may be omitted, for example if the visual device 120 comprises means such as transparent lenses for letting light from the environment through to the user, on which virtual elements may be projected to augment the environment viewed by the user.
In addition, it would be understood that the XR content 40 may instead be an augmented reality content or a virtual reality content.
In examples described above, the XR content 40 includes a number of virtual elements, for example the planned trajectory 41, the points 42 along the planned trajectory, the pointer 43, the virtual representation 50' of the object 50, the virtual representation 20' of the robot (or one or more parts of the robot). However, it would be understood that a user input may be received to selectively hide any of these virtual elements from being displayed on the XR content 40, for example via the input module 110. Additionally, the user may select which part(s) of the robot should have a virtual representation displayed on the XR content 40.
Additionally or alternatively, the user input received via the input module 110 may activate or interrupt any animation being displayed in the XR content 40, such as an animation of the virtual representation 20' of the robot 20 (or one or more parts of the robot selected by the user).
In examples described above, the planned trajectory is for a movement of at least a part of the robot 20 that has not yet been initiated, and which initiates when the input module 110 receives a second user input indicating that the movement should begin.
However, this is non-limiting, as the at least a part of the robot 20 may already be moving along the planned trajectory. In such cases, the communication module 130 may transmit the signal causing the movement of the at least a part of the robot substantially along the new trajectory without waiting for a second user input to be received (i.e. without receiving a confirmation from the user that the new trajectory is approved).
In examples described above, the XR device comprises one input module 110. However, it would be understood that more than one input module may be provided instead, for example two input modules 110 each held in one hand of the user 60. Alternatively, a larger number of input modules may be provided, and the user may select which to operate at any given time.
In examples described above, the input module 110 transmits a first signal to the visual device 120, which in turn transmits a second signal to the communication module 130, or the input module 110 transmits the first signal to the control module 140. However, this is non-limiting, as the input module 110 may instead transmit the first signal (e.g. broadcast or multicast the first signal) to the communication module 130 and/or the control module 140 as well as the visual device 120, or instead of the visual device 120. In that case, the visual device 120 receiving the first signal may merely adapt the XR content 40 being displayed. If the visual device 120 does not receive the first signal, the communication module 130 or the control module 140 may transmit a separate signal to the visual device 120 indicating a required modification to the displayed XR content.
In examples described above, the input module 110 is used to manipulate a planned trajectory of at least a part of the robot 20.
In addition, the input module 110 may be operated (e.g. by pressing one or more buttons on the input module 110) to indicate a new desired operation of the robot 20 (i.e. other than manipulating a planned trajectory), such as instructing an end effector 24 having a suction cup to apply a vacuum to grip an object, to interrupt an applied vacuum to release an object, or to cause a 3D printing head to begin heating and extruding a filament, etc. In that case, the first signal may be transmitted to the communication module 130 and/or the control module 140 directly, rather than to the visual device 120.
In examples described above, a visual alert is provided to the user via the XR content displayed by the visual device to indicate that the new trajectory desired by the user cannot be followed. However, it would be understood that a sound and/or haptic alert indicating that the new trajectory cannot be followed may be used to notify the user, instead of or in addition to the visual alert. Additionally, it would be understood that the visual, sound and/or haptic alert may, instead of indicating an issue related to a user input, indicate a fault or other problem of the robot, for example that a component of the robot requires a repair or that a movement of the robot is blocked, and that the user should investigate.
The example aspects described here avoid limitations, specifically rooted in computer technology, relating to the control of robots having at least one robotic arm. By virtue of the example aspects described herein, the user experience when controlling robots may be improved. Also, by virtue of the foregoing capabilities of the example aspects described herein, which are rooted in computer technology, the example aspects described herein improve computers and computer processing/functionality, and also improve the field(s) of at least robot control and data processing.
In the foregoing description, example aspects are described with reference to several example embodiments. Accordingly, the specification should be regarded as illustrative, rather than restrictive. Similarly, the figures illustrated in the drawings, which highlight the functionality and advantages of the example embodiments, are presented for example purposes only. The architecture of the example embodiments is sufficiently flexible and configurable, such that it may be utilized in ways other than those shown in the accompanying figures.
Software embodiments of the examples presented herein may be provided as a computer program, or software, such as one or more programs having instructions or sequences of instructions, included or stored in an article of manufacture such as a machine-accessible or machine-readable medium, an instruction store, or computer-readable storage device, each of which can be non-transitory, in one example embodiment. The program or instructions on the non-transitory machine-accessible medium, machine-readable medium, instruction store, or computer-readable storage device, may be used to program a computer system or other electronic device. The machine- or computer-readable medium, instruction store, and storage device may include, but are not limited to, floppy diskettes, optical disks, and magneto-optical disks or other types of media/machine-readable medium/instruction store/storage device suitable for storing or transmitting electronic instructions. The techniques described herein are not limited to any particular software configuration. They may find applicability in any computing or processing environment. The terms "computer-readable", "machine-accessible medium", "machine-readable medium", "instruction store", and "computer-readable storage device" used herein shall include any medium that is capable of storing, encoding, or transmitting instructions or a sequence of instructions for execution by the machine, computer, or computer processor and that causes the machine/computer/computer processor to perform any one of the methods described herein. Furthermore, it is common in the art to speak of software, in one form or another (e.g., program, procedure, process, application, module, unit, logic, and so on), as taking an action or causing a result. Such expressions are merely a shorthand way of stating that the execution of the software by a processing system causes the processor to perform an action to produce a result.
Some embodiments may also be implemented by the preparation of application-specific integrated circuits, field-programmable gate arrays, or by interconnecting an appropriate network of conventional component circuits.
Some embodiments include a computer program product. The computer program product may be a storage medium or media, instruction store(s), or storage device(s), having instructions stored thereon or therein which can be used to control, or cause, a computer or computer processor to perform any of the procedures of the example embodiments described herein. The storage medium/instruction store/storage device may include, by example and without limitation, an optical disc, a ROM, a RAM, an EPROM, an EEPROM, a DRAM, a VRAM, a flash memory, a flash card, a magnetic card, an optical card, nano systems, a molecular memory integrated circuit, a RAID, remote data storage/archive/warehousing, and/or any other type of device suitable for storing instructions and/or data.
Stored on any one of the computer-readable medium or media, instruction store(s), or storage device(s), some implementations include software for controlling both the hardware of the system and for enabling the system or microprocessor to interact with a human user or other mechanism utilizing the results of the example embodiments described herein. Such software may include without limitation device drivers, operating systems, and user applications. Ultimately, such computer-readable media or storage device(s) further include software for performing example aspects of the invention, as described above.
Included in the programming and/or software of the system are software modules for implementing the procedures described herein. In some example embodiments herein, a module includes software, although in other example embodiments herein, a module includes hardware, or a combination of hardware and software.
While various example embodiments of the present invention have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art(s) that various changes in form and detail can be made therein. Thus, the present invention should not be limited by any of the above described example embodiments, but should be defined only in accordance with the following claims and their equivalents.
Further, the purpose of the Abstract is to enable the Patent Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The Abstract is not intended to be limiting as to the scope of the example embodiments presented herein in any way. It is also to be understood that any procedures recited in the claims need not be performed in the order presented.
While this specification contains many specific embodiment details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments described herein. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Having now described some illustrative embodiments, it is apparent that the foregoing is illustrative and not limiting, having been presented by way of example. In particular, although many of the examples presented herein involve specific combinations of apparatus or software elements, those elements may be combined in other ways to accomplish the same objectives. Acts, elements and features discussed only in connection with one embodiment are not intended to be excluded from a similar role in other embodiments.
The apparatuses described herein may be embodied in other specific forms without departing from the characteristics thereof. The foregoing embodiments are illustrative rather than limiting of the described systems and methods. The scope of the apparatuses described herein is thus indicated by the appended claims, rather than the foregoing description, and changes that come within the meaning and range of equivalence of the claims are embraced therein.
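By way of non-limiting illustration only, the following Python sketch shows one possible arrangement of the software modules described above. All names in the sketch (Trajectory, XRDevice, read_manipulation, apply_constraints, send_move_command, and so forth) are hypothetical and chosen purely for exposition; the sketch assumes collaborator objects supplying the display, input, constraint, and communication behaviour, and it is not the claimed implementation.

    from dataclasses import dataclass, field

    @dataclass
    class Trajectory:
        """A trajectory as an ordered list of waypoints in three-dimensional space."""
        points: list = field(default_factory=list)  # e.g. [(x, y, z), ...]

    class XRDevice:
        """One possible composition of the visual, input, control and communication modules."""

        def __init__(self, visual_device, input_module, control_module, communication_module):
            self.visual_device = visual_device                # displays the XR content
            self.input_module = input_module                  # receives the user's manipulation
            self.control_module = control_module              # applies predetermined constraints
            self.communication_module = communication_module  # transmits motion signals to the robot

        def revise_trajectory(self, planned: Trajectory) -> Trajectory:
            # Display the planned trajectory as a manipulable virtual element.
            self.visual_device.display(planned)
            # Receive the user input manipulating the virtual element to define a new trajectory.
            new = self.input_module.read_manipulation(planned)
            # Optionally modify the new trajectory to respect predetermined constraints.
            new = self.control_module.apply_constraints(new)
            # Transmit a signal for causing the robot to move substantially along the new trajectory.
            self.communication_module.send_move_command(new)
            return new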

Claims

1. An extended reality, XR, device (10; 11) for a user (60) controlling a robot (20) comprising a robotic arm (22) and an end effector (24), said XR device (10; 11) comprising:
a visual device (120) configured to display an extended reality, XR, content (40) to said user (60), said XR content (40) representing an environment (50; 51; 52) in which at least one of said user (60) and said robot (20) is located and comprising a virtual element representing a planned trajectory (41) for at least a part of said robot (20);
an input module (110) configured to receive a user input manipulating said virtual element to define a new trajectory (44) for said at least a part of said robot (20); and
a communication module (130) configured to transmit a signal for causing said at least a part of said robot (20) to move substantially along said new trajectory (44).
2. The XR device (10; 11) according to claim 1, wherein said XR content (40) further comprises a virtual representation (20') of said at least a part of said robot, and said visual device (120) is configured to display a movement of said virtual representation (20') in accordance with at least one of said planned trajectory (41) and said new trajectory (44).
3. The XR device (11) according to claim 1 or claim 2, wherein said XR device (11) further comprises a control module (140) configured to cooperate with said input module (110) to define said new trajectory (44) in accordance with a set of predetermined constraints defined for at least one of said environment (50; 51; 52) and said robot (20).
4. The XR device (10) according to claim 3, wherein said control module (140) is configured to cooperate with said input module (110) to modify said new trajectory (44), to reduce at least one of:
an amount of energy required for said at least a part of said robot (20) to reach an end point of said new trajectory (44),
a risk of collision of at least one of said robot (20) and an object (50) manipulated by said robot (20) with said environment (52), and
an effect that a movement of said at least a part of said robot (20) substantially along said new trajectory (44) would have on an object (50) manipulated by said robot (20).
5. The XR device (10) according to any one of claims 1 to 4, wherein said communication module (130) is configured to communicate with a robot module (30) on which said robot (20) is placed, said robot module (30) comprising a plurality of sensors (32) configured to sense a space (36) in a vicinity of said robot (20) and said robot module (30) for the presence of any object.
6. The XR device (10) according to claim 5, wherein said communication module (130) is configured to obtain data related to at least one of said robot (20) and said space (36) from said robot module (30).
7. The XR device (10) according to any one of claims 1 to 6, wherein said visual device (120) is configured to modify said virtual element in accordance with said received user input substantially contemporaneously with said user (60) manipulating said virtual element.
8. The XR device (10) according to any one of claims 1 to 7, wherein said virtual element comprises at least one of a curve (41) and a number of points (42) in a virtual three-dimensional space.
9. The XR device (10) according to any one of claims 1 to 8, wherein said user input modifies said virtual element by dragging, or by moving a part of said virtual element in said environment represented by said XR content (40).
10. The XR device (10) according to any one of claims 1 to 9, wherein said XR device (10) comprises means to alert (47) said user (60) of an issue related to at least one of said new trajectory (44) and said robot (20).
11. The XR device (10) according to claim 10, wherein said means to alert comprise at least one of audio, visual and haptic means.
12. A system comprising said XR device (10) according to any one of claims 1 to 11 and said robot (20).
13. A method for a user (60) controlling a robot (20) comprising a robotic arm (22) and an end effector (24), the method comprising:
displaying an extended reality, XR, content (40) by means of a visual device (120), said XR content (40) representing an environment (50; 51; 52) in which at least one of said user (60) and said robot (20) is located and comprising a virtual element representing a planned trajectory (41) for at least a part of said robot (20);
receiving a user input manipulating said virtual element to define a new trajectory (44) for said at least a part of said robot (20); and
transmitting a signal for causing said at least a part of said robot (20) to move substantially along said new trajectory (44).
14. The method according to claim 13, wherein said planned trajectory (41) is a planned trajectory of at least one of said robotic arm (22) and said end effector (24).
15. The method according to claim 13 or claim 14, wherein said XR content (40) further comprises a virtual representation (20') of said at least a part of said robot, and the method further includes displaying a movement of said virtual representation (20') in accordance with at least one of said planned trajectory (41) and said new trajectory (44).
16. The method according to any one of claims 13 to 15, further comprising modifying said new trajectory (44) in accordance with a set of predetermined constraints defined for at least one of said environment (50; 51; 52) and said robot (20).
17. The method according to claim 16, wherein said set of predetermined constraints comprises at least one of a physical constraint and an operational constraint.
18. The method according to claim 16 or claim 17, wherein said modifying said new trajectory (44) includes reducing at least one of:
an amount of energy required for said at least a part of said robot (20) to reach an end point of said new trajectory (44),
a risk of collision of at least one of said robot (20) and an object (50) manipulated by said robot (20) with said environment (52), and
an effect that a movement of said at least a part of said robot (20) along said new trajectory (44) would have on an object (50) manipulated by said robot (20).
19. A computer program comprising instructions which, when executed by one or more processors, cause said one or more processors to carry out a method comprising:
causing a visual device (120) to display an extended reality, XR, content (40), said visual device (120) being used by a user (60) controlling a robot (20) comprising a robotic arm (22) and an end effector (24), said XR content (40) representing an environment (50; 51; 52) in which at least one of said user (60) and said robot (20) is located and comprising a virtual element representing a planned trajectory (41) for at least a part of said robot (20);
receiving a user input manipulating said virtual element to define a new trajectory (44) for said at least a part of said robot (20); and
transmitting a signal for causing said at least a part of said robot (20) to move substantially along said new trajectory (44).
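By way of non-limiting illustration of the reductions recited in claims 4 and 18 above, one simple approach is to score candidate trajectories with a cost that combines the path length (a crude proxy for the amount of energy required to reach the end point) and the clearance from known obstacles (a crude proxy for the risk of collision), and to prefer the lowest-cost candidate. The Python sketch below is purely expository: the cost terms, weights and names are hypothetical assumptions, not the claimed method.

    import math

    def path_length(points):
        """Total Euclidean length of a polyline; a simple proxy for energy use."""
        return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

    def min_clearance(points, obstacles):
        """Smallest waypoint-to-obstacle distance; larger means lower collision risk."""
        return min(math.dist(p, o) for p in points for o in obstacles)

    def cost(points, obstacles, clearance_weight=1.0):
        """Lower is better: short paths that stay clear of obstacles."""
        return path_length(points) - clearance_weight * min_clearance(points, obstacles)

    def pick_lowest_cost(candidate_trajectories, obstacles):
        """Among candidate waypoint lists, choose the one with the lowest cost."""
        return min(candidate_trajectories, key=lambda pts: cost(pts, obstacles))

    # Example: of two candidates passing an obstacle at (1, 0, 0), the higher,
    # better-cleared path wins despite equal length.
    best = pick_lowest_cost(
        [[(0, 0, 1), (1, 0, 1), (2, 0, 1)],
         [(0, 0, 0.1), (1, 0, 0.1), (2, 0, 0.1)]],
        obstacles=[(1, 0, 0)],
    )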
PCT/EP2024/059089 2023-04-05 2024-04-03 Extended reality interface for robotic arm Ceased WO2024208919A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP24717173.9A EP4688341A1 (en) 2023-04-05 2024-04-03 Extended reality interface for robotic arm

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT102023000006741 2023-04-05
IT102023000006741A IT202300006741A1 (en) 2023-04-05 2023-04-05 Extended Reality Interface for Robotic Arm

Publications (1)

Publication Number Publication Date
WO2024208919A1 true WO2024208919A1 (en) 2024-10-10

Family

ID=86732780

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/059089 Ceased WO2024208919A1 (en) 2023-04-05 2024-04-03 Extended reality interface for robotic arm

Country Status (3)

Country Link
EP (1) EP4688341A1 (en)
IT (1) IT202300006741A1 (en)
WO (1) WO2024208919A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140277737A1 (en) * 2013-03-18 2014-09-18 Kabushiki Kaisha Yaskawa Denki Robot device and method for manufacturing processing object
US20210170603A1 (en) * 2018-04-19 2021-06-10 Yuanda Robotics Gmbh Method for using a multi-link actuated mechanism, preferably a robot, particularly preferably an articulated robot, by a user by means of a mobile display apparatus
WO2022161637A1 (en) * 2021-02-01 2022-08-04 Abb Schweiz Ag Visualization of a robot motion path and its use in robot path planning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
REINHART G ET AL: "A programming system for robot-based remote-laser-welding with conventional optics", CIRP ANNALS, ELSEVIER BV, NL, CH, FR, vol. 57, no. 1, 1 January 2008 (2008-01-01), pages 37 - 40, XP022674597, ISSN: 0007-8506, [retrieved on 20080509], DOI: 10.1016/J.CIRP.2008.03.120 *

Also Published As

Publication number Publication date
EP4688341A1 (en) 2026-02-11
IT202300006741A1 (en) 2024-10-05

Similar Documents

Publication Publication Date Title
US9987744B2 (en) Generating a grasp pose for grasping of an object by a grasping end effector of a robot
US10105843B2 (en) Robot device, remote control method of robot device, and program
CN110888428B (en) Mobile robot, remote terminal, computer readable medium, control system, control method
US8965579B2 (en) Interfacing with a mobile telepresence robot
KR102859547B1 (en) Robot system and Control method of the same
US20150273689A1 (en) Robot control device, robot, robotic system, teaching method, and program
CN114728413A (en) Method and system for controlling graphical user interface of remote robot
CN121411450A (en) System and method for initializing a robot to autonomously travel along a training route
CN112638593A (en) Augmented reality visualization techniques for robotic pick-up systems
CN114341930B (en) Image processing device, camera, robot, and robot system
US20250249589A1 (en) Systems and methods for teleoperated robot
JP7517803B2 (en) ROBOT TEACHING SYSTEM, IMAGE GENERATION METHOD, AND PROGRAM
US20140285633A1 (en) Robotic system and image display device
KR20130072748A (en) Device and method for user interaction
US11992960B2 (en) Robot system, control apparatus of robot system, control method of robot system, imaging apparatus, and storage medium
JPWO2022124398A5 (en)
EP3946825A1 (en) Method and control arrangement for determining a relation between a robot coordinate system and a movable apparatus coordinate system
JP2026503080A (en) Method and system for calibrating a camera
JP6842441B2 (en) Programs and information processing equipment
US11407117B1 (en) Robot centered augmented reality system
WO2024208919A1 (en) Extended reality interface for robotic arm
JP6657858B2 (en) Robot operation system
KR101975556B1 (en) Apparatus of controlling observation view of robot
EP3599539B1 (en) Rendering objects in virtual views
US12095964B2 (en) Information processing apparatus, information processing method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24717173

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2024717173

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2024717173

Country of ref document: EP

Effective date: 20251105
