1. Introduction
Despite some recent developments in the area of robotics within nuclear environments, the industry has typically lagged a few years behind many others in terms of uptake. Historically, tele-operational control has been used within such radiologically active environments, with devices such as master-slave manipulators first used in 1948 [1]. Due to several factors, such as the desire to protect job security [2] and the fear of an automated robotic platform suffering a significantly damaging fault, tele-operation has remained in favour until relatively recently. Tele-operated and hydraulically actuated robots, such as the CEA-developed MAESTRO [3,4], have been used to demonstrate the concept of using radiation hardened platforms within radiologically active environments. The MAESTRO is a serial robot composed of six links and six hydraulic rotational joints (fed by a flow control servo-valve), with joint angular position measured with resolver sensors. Other commercial off the shelf (COTS) solutions, such as the tele-operated BROKK series of hydraulic manipulators, have been used to aid Russian decommissioning efforts [5], for example. Furthermore, the tele-operated DECON robot at the Windscale Advanced Gas-cooled Reactor (WAGR) has been used for operations in order to significantly reduce dose to manual workers. Using solely human labour within the experimental reactor would have resulted in estimated dose rates of 1 Sv/h, indicating that the annual 20 mSv legal dose limit would have been reached in a little over a minute. However, use of the relatively crude manipulator resulted in a maximum dose to manual labour of only 17 mSv over the course of the six years of the project [6].
There is a large body of research in this general area concerned with hydraulic manipulator control, the effective gripping of pipework and efficient path planning. Koivumaki et al. [7], for example, have investigated energy-efficiency improvements in control, using a high-precision closed-loop controller with highly nonlinear hydraulic robotic manipulators, and using a heavy duty (475 kg payload) 3 DoF hydraulic arm as a test bed. Lin et al. [8] have utilized rapid prototyping techniques in order to develop Shape Memory Alloy (SMA) fingers to improve robotic gripping, specifically for underwater use. The work has been further improved by the direct control of these grippers using electromyography (EMG). A multi-objective genetic algorithm (MOGA) was developed and applied by Borboni et al. [9] for a serial robot designed to enable laser pipe cutting. The authors inserted a redundant DoF in order to allow infinite inverse kinematic solutions and thus increase the ease of determining possible cutting paths. The MOGA was adopted to reduce vibration effects, allowing for relatively smooth cutting. In the area of practical automated cutting, Alatorre et al. [10] considered the use of sound in order to determine the optimal amount of force required to cut objects, including pipework. They report consistent force application to within 0.08 N using this technique. Assuming pipework is detached, Diep et al. [11] considered a cutting system designed to operate on a rotating pipe, as their findings indicated this was an improvement compared to a fixed pipe. Further, Pardi et al. [12] have addressed the problem of constrained motion planning using a 3D vision system.
There is also a body of work related to cutting using purely tele-operational control, such as Chen et al. [13], who report on remote cutting of cooling pipes within the China Spallation Neutron Source (CSNS) using a tele-operated hydraulic cutter. Another example is Hafenkamp et al. [14], who have developed a tele-operated electrical/pneumatic underwater climbing robot to be used for cutting oil and gas pipelines. There is also a substantial body of more theoretically derived work. For example, Liu et al. [15] present a trajectory and velocity planning method for a robot to machine a spherical single Y-groove in a spherical pipe. Others have looked at pipe cutting robots in a holistic sense, considering aspects such as economic feasibility. An example is Han et al. [16], who have examined the economic feasibility of robot development and verification for the sustainable utilization of a steel pipe pile head cutting robot in South Korea. Interest in robotic pipe cutting is obviously not confined to the academic world, with a number of pipe cutting system patents also filed, e.g., Ogawa et al. [17].
However, some form of semi-autonomy, in which high-level control is determined by the operator and low-level control is determined automatically, is becoming more popular [18,19,20]. Marturi et al. [19], for example, have investigated the benefit of adding a level of autonomy during manipulation tasks: direct tele-operation is compared with a visually guided semi-autonomous system in a block stacking task. The semi-autonomous system is reported to have performed better than direct tele-operation, improving task completion time, precision, and repeatability. This general move towards a more autonomous outlook has occurred for several reasons, such as the excessive and unrealistic need for numerous skilled operators to directly control robots. Anticipated benefits are not limited to avoiding human exposure to radiation, but also include the improvement of general safety through the removal of workers from potential conventional accidents, increased productivity during decommissioning activity, and a reduction in the number of repetitive tasks leading to mistakes [21]. Semi-autonomy in environments such as this has been a subject of interest for some time. Talha et al. [22] have investigated current tele-operated systems, with multiple buttons and joysticks and with multiple cameras in order to provide different workspace views. Bruemmer et al. [23] focus on a platform used for characterisation in nuclear environments, with a mixed-initiative control scheme developed to improve performance over direct tele-operation. The availability of cheap and mass-produced RGBD vision systems has also contributed to the willingness to adopt semi-autonomy in the industry, with the autonomous determination of object position and orientation allowing the tracking of objects using techniques such as Visual Simultaneous Localisation and Mapping (VSLAM). This in turn enables the automation of many tasks which may have otherwise been undertaken by people.
Radiation tolerance within electrically actuated platforms is a significant issue, with a decrease in performance caused by several mechanisms. These can include the loss of insulation in the motor coils or connection wires, the embrittlement of the connections generally, lubricant hardening within the bearings or gearbox, or even degradation of the electronics [24]. Nancekieval [25] considers some individual components commonly used in robotics and concludes that weak points may be voltage regulators, which typically cease operation at an equivalent dose of 5 kGy, or microcontrollers, which typically experience breakdown at 1.2 kGy. In [24] it is noted that the dose rate at the surface of a vitrified high level waste container may be as much as 10 kGy/h. If this figure is used as a worst-case scenario, it corresponds to a time limit of less than half an hour before likely breakdown of the robot platform, assuming a voltage regulator is present in the platform. Hydraulic actuators are generally more radiation tolerant, although the variety of components typically used means this varies greatly from platform to platform. For example, the Gamma 7F master-slave manipulator is noted to have a radiation tolerance of over 100,000 Gy [26], whereas Nieminen et al. [27] claim that a tolerance of 100 Gy/h and a cumulative tolerance of 1 MGy may be required for the hydraulic manipulator in use at ITER. Generally, the purely hydraulic parts of such manipulators are affected relatively little by radiation, with the small non-hydraulic parts typically being the weakness.
Hydraulically actuated manipulators do, however, present challenges in comparison with electrically operated ones, with the main issue being the reduced positional precision attainable. Whilst differences in manipulator size and dynamic rates make like-for-like comparison of precision difficult, hydraulic manipulators can typically achieve positional precision in the order of millimetres, compared to the micrometre precision attainable with some electrically actuated arms [28]. This is chiefly due to two characteristics of the hydraulic actuation system and the difficulties associated with control. Firstly, the joints exhibit a variable dead-zone with respect to the voltage applied to the hydraulic valves, making the exact behaviour of the arms difficult to predict and control complex when small or slow movements are required [29,30]. The second issue is potential non-linearities in the actuator response caused by friction, particularly when operating around and passing through the zero-velocity point. Both of these concerns can be mitigated somewhat with advanced modelling [31] and control methods [30], although it is still difficult to control hydraulic arms to the same level of precision as electrical ones.
Within this context, the present article describes experimental investigations into the practical implementation of a system for semi-autonomous cutting of pipework, using a computer controlled and hydraulically actuated dual-manipulator system. The development and preliminary evaluation of the vision system has been reported previously in this journal [32], although that work focused on the low-level control problem, with experimental results limited to positioning and grasping tasks only. An early version of the vision system used is also described by West et al. [33], again for positioning tasks only. By contrast, the present work represents the first time this hydraulically actuated system has been adapted to perform real cutting operations, thus bringing the Technology Readiness Level (TRL) up to a value of around six (TRL being an industry standard scale of technological maturity from one to nine [34]). The main contributions relate to the lessons learnt from this new experimental work, an evaluation of the positional accuracy of the approach, and a comparison with manual tele-operation and cutting. Three different pipe materials are cut here, namely cardboard, ABS plastic and aluminium. Hence, this new article describes the latest vision and control software, kinematics, and the reconfigured bespoke hardware arrangement, including an upgraded hydraulic system and the installation of an off-the-shelf reciprocating saw.
The system combines four elements: (i) a low cost RGBD sensor to acquire positional data; (ii) the MATLAB software environment (Mathworks, Natick, MA, USA) running on a standard Windows PC; (iii) LabVIEW (National Instruments, Austin, TX, USA) running on a small form factor specialist computer; and (iv) two hydraulically actuated manipulator arms (Hydrolek-7W, Hydrolek, Fareham, UK). The vision data, acquired with the aid of a human operator, are processed within the MATLAB environment, i.e., pixel and depth data are converted into the required Cartesian gripper and saw tool positions using simple geometry. These are transferred via the User Datagram Protocol (UDP) to the small form factor computer, which in turn performs the required kinematic calculations within the LabVIEW environment. National Instruments Compact FieldPoints (CFPs) are subsequently used to transfer the joint position data to the two hydraulically actuated manipulator arms, which further have potentiometers onboard to provide feedback.
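One plausible form of this pixel-and-depth to Cartesian conversion, assuming a simple pinhole camera model and nominal (uncalibrated) Kinect v1 intrinsic values, is sketched in MATLAB below; the numerical values and variable names are illustrative assumptions rather than those used in the actual system.

% Hedged sketch: back-projecting an operator-selected pixel plus its Kinect
% depth reading into Cartesian coordinates. The intrinsics below are typical
% published Kinect v1 figures (assumed), not calibrated values from this rig.
fx = 525;   fy = 525;      % focal lengths in pixels (assumed)
cx = 319.5; cy = 239.5;    % principal point (assumed)

u = 350; v = 220;          % example pixel selected by the operator
Z = 1.10;                  % depth at that pixel, in metres

X = (u - cx) * Z / fx;     % lateral offset from the optical axis
Y = (v - cy) * Z / fy;     % vertical offset from the optical axis
target_cam = [X, Y, Z];    % position in the camera frame; a fixed camera-to-base
                           % transform would then map this into the arm base frame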
The vision system is briefly described in Section 2, with the Hydrolek manipulators and kinematics described in Section 3. Section 4 presents the experimental methodology utilized, followed in Section 5 by the results for the laboratory validation of the new semi-autonomous system for pipe cutting. Finally, the conclusions are presented in Section 6.
2. Vision System
Vision systems are necessarily electrical and so may not exhibit the same radiation hardness enjoyed by the hydraulic actuation system, which was purposely chosen initially for its radiological toughness. There have been several studies of the radiation effects on COTS cameras with, for example, Wang et al. [35] concluding that the effects of dose rates as low as 20 Gy/h are evident on images created using a typical CMOS sensor. This could mean regular replacement of the camera system, and thus a cheap and widespread vision solution may be preferable to a more advanced but expensive one. Indeed, the accuracy offered by the hydraulic arms is relatively poor and thus the accuracy of the vision system is unlikely to be the limiting factor. In this vein, a purposefully low-quality but cost-effective sensor has been selected for R&D purposes, namely a Microsoft Kinect v1.
The MATLAB algorithm used here has evolved from earlier research by the present authors and is described in more detail in [32,33]. The operator elects when to take a snapshot of the Kinect (Microsoft, Redmond, WA, USA) RGB camera field of view, before the algorithm determines the major interfaces between the colours in the image via Canny edge detection [36,37], with the thresholds then selected and altered by the operator. The object of interest is selected by clicking on it, whereupon the regionprops function enables the determination of gripping and cutting positions. The interface for performing this action is shown in Figure 1. Simple trigonometry is used to produce six pieces of information which are sent via UDP data packets to the small form factor computer running the LabVIEW software: Xgripper, Ygripper, Zgripper, Xcutter, Ycutter and Zcutter. These are transmitted utilising a straightforward user-created function featuring seven user-definable parameters, with the format:
udp_comm(‘MODE, ARM, p1, p2, p3, p4, p5’)
If the number ‘0’ is used for the MODE parameter, then the joint positions can be specified directly via parameter values p1 to p5. However, if the MODE parameter is set to ‘1’ then parameters p1 to p3 define the X, Y and Z end effector coordinates, with p4 and p5 ignored. Further, the ARM parameter allows the operator to choose which manipulator the command refers to (‘0’ for left, ‘1’ for right).
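To illustrate this processing chain, a minimal MATLAB sketch is given below. The calls to edge, regionprops and udp_comm follow the description above, whereas the file names, Canny threshold, grasp/cut offsets within the bounding box and the pix2cart helper (a hypothetical pixel-plus-depth to base-frame Cartesian conversion) are illustrative assumptions rather than the exact implementation behind Figure 1.

% Hedged sketch of the operator-assisted target selection described above.
rgb      = imread('kinect_snapshot.png');       % operator-triggered RGB snapshot (placeholder file)
depthMap = double(imread('kinect_depth.png'));  % registered depth image (placeholder file)

bw = edge(rgb2gray(rgb), 'canny', 0.20);        % operator-tuned Canny threshold (assumed value)
bw = imfill(bw, 'holes');                       % close the pipe outline into a filled region

imshow(rgb); hold on
[uc, vc] = ginput(1);                           % operator clicks the pipe of interest
stats    = regionprops(bw, 'Centroid', 'BoundingBox');
cents    = vertcat(stats.Centroid);
[~, k]   = min(sum((cents - [uc, vc]).^2, 2));  % region closest to the click
box      = stats(k).BoundingBox;                % [x y width height] in pixels

gripPix = [box(1) + box(3)/2, box(2) + 0.1*box(4)];  % grasp near the top of the pipe (assumed offset)
cutPix  = [box(1) + box(3)/2, box(2) + 0.9*box(4)];  % cut near the bottom (assumed offset)

gripXYZ = pix2cart(gripPix, depthMap);          % hypothetical helper: pixel + depth -> base-frame XYZ
cutXYZ  = pix2cart(cutPix,  depthMap);

% Mode 1 (Cartesian) commands: ARM '1' (right) grips, ARM '0' (left) carries the saw
udp_comm(sprintf('1, 1, %.3f, %.3f, %.3f, 0, 0', gripXYZ));
udp_comm(sprintf('1, 0, %.3f, %.3f, %.3f, 0, 0', cutXYZ));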
The LabVIEW software environment on the second computer provides a further Graphical User Interface (GUI), with the bespoke interface developed by the authors allowing for both modes. Mode 0 is the default and allows the user to define the required joint angles by simple entry into data boxes within the front end of the software. These values are used with the in-built, user-tuneable PID control system to set the angles of the manipulator joints according to feedback from the built-in rotary potentiometers. Mode 1 allows the user to set a desired end effector position either by directly entering x, y and z Cartesian co-ordinates into the LabVIEW GUI, or by using the data received via the UDP network interface from the MATLAB algorithm.
3. Hydrolek HLK-7W Manipulators
Two Hydrolek HLK-7W manipulators are used here, each consisting of a 6-degree-of-freedom articulated arm as illustrated in Figure 2 and Figure 3, with the corresponding Denavit-Hartenberg parameters shown in Table 1. There is further an option of a gripper as a seventh actuator on either manipulator. The kinematic configurations of the two arms are nearly identical, with the only difference being the shoulder pitch joint, which has a 3.1 cm offset towards the positive y-axis for the right arm and the negative y-axis for the left arm. For the present research, the right arm features seven actuators (q1 to q6, plus an end-effector gripper) whereas the left arm has only five (q1 to q5), since the gripper assembly has instead been used to mount the reciprocating saw.
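For reference, the forward kinematics implied by Table 1 can be obtained by chaining standard Denavit-Hartenberg transforms. The MATLAB sketch below is generic and takes the joint angles and DH rows as inputs; it does not reproduce the specific parameter values of Table 1.

% Forward kinematics sketch from Denavit-Hartenberg parameters (classic convention).
function T = fk_dh(q, dh)
% q  : column vector of joint angles (rad)
% dh : n-by-4 matrix with rows [a, alpha, d, theta_offset]
T = eye(4);
for i = 1:size(dh, 1)
    a = dh(i,1); al = dh(i,2); d = dh(i,3); th = q(i) + dh(i,4);
    A = [cos(th) -sin(th)*cos(al)  sin(th)*sin(al) a*cos(th);
         sin(th)  cos(th)*cos(al) -cos(th)*sin(al) a*sin(th);
         0        sin(al)          cos(al)         d;
         0        0                0               1];
    T = T * A;   % chain the joint transforms from base to end effector
end
end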
The missing wrist rotation (q6) results in the inability to solve the inverse kinematics for an arbitrary orientation of the end-effector. A Moore-Penrose pseudoinverse algorithm [38,39] is therefore used to approximate the inversion. The pseudoinverse algorithm is based on the singular value decomposition method from linear algebra, with implementations available within both the MATLAB and LabVIEW environments.
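A minimal sketch of an iterative, position-only inverse kinematics solution of this kind is given below, based on MATLAB's pinv and a finite-difference Jacobian built from the forward-kinematics function sketched in the previous listing. The step size, tolerance and iteration limit are illustrative assumptions rather than the tuned values used on the real system.

% Iterative position-only inverse kinematics using the Moore-Penrose pseudoinverse.
function q = ik_pinv(xyz_des, q0, dh)
% xyz_des : desired end-effector position (3x1), q0 : initial joint angles (column)
q = q0;
for iter = 1:200
    T   = fk_dh(q, dh);
    err = xyz_des(:) - T(1:3, 4);             % Cartesian position error
    if norm(err) < 1e-3, break; end            % ~1 mm convergence tolerance (assumed)
    J = zeros(3, numel(q));                    % numerical position Jacobian
    h = 1e-6;
    for j = 1:numel(q)
        qp = q; qp(j) = qp(j) + h;
        Tp = fk_dh(qp, dh);
        J(:, j) = (Tp(1:3,4) - T(1:3,4)) / h;
    end
    q = q + 0.5 * pinv(J) * err;               % scaled step along the pseudoinverse direction
end
end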
In contrast to previous research using the same manipulators [32], the hydraulic system has recently been upgraded with a model MKPTO415 V16 V15 Pressure & Tank Circuit Hydraulic Power Unit (Bosch Rexroth, Lohr, Germany) and a pressure pump driven by a three-phase AC four-pole electric motor. The unit provides a flow rate of 5.5 L/min at 220 bar and has a 15-litre oil tank. This upgrade was primarily undertaken to address shortfalls in the previous system and improve efficiency and maintainability.
The hydraulic flow to the manipulator joints is controlled via a series of solenoid actuated valves arranged in a manifold. The solenoids are in turn controlled via the National Instruments (NI) Compact DAQ 9132 system (cDAQ), a 1.33 GHz dual-core Atom computer with four slots for I/O modules. The system runs both Windows 7 Embedded Edition and LabVIEW 2018 for programming and interfacing with external peripherals. The cDAQ 9132 currently utilises an NI 9205 module (a 32-channel analogue-to-digital converter (ADC)) and two NI 9264 modules (16-channel digital-to-analogue converters (DACs)).
The cDAQ 9132 is powered by an adjustable 24 V NI PS-15 Power Supply Unit (PSU), which also provides 24 V DC to the NI 9205 and NI 9264 modules. The two NI 9264 modules are used to actuate the P02 AD1 valves in the two Hydrolek manipulators. Each joint requires connection to two directional valves; therefore, 10 analogue outputs from one NI 9264 module are required to operate the left arm with its five working joints, with the second module serving the right arm and its seven actuators (including the gripper). The angle position sensors used in this configuration are simple rotary potentiometers requiring a 10 V supply. A dedicated box holding all of these elements has been mounted onto the stand where the two arms are located. A monitor, mouse and keyboard are externally connected such that an operator can control or program the robot from outside the robotic cell.
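As an illustration of this arrangement, a simplified, proportional-only version of the per-joint loop might map a signed demand onto the two valve channels as sketched below. The gain, the voltage range and the variable names are assumptions for illustration only; the actual loop is the user-tuneable PID implemented in LabVIEW.

% Hedged sketch: mapping a signed joint demand onto the two directional valves.
q_des  = 0.60;                              % example joint angle demand (rad, assumed)
q_meas = 0.52;                              % example potentiometer reading (rad, assumed)

Kp  = 8;                                    % proportional gain (assumed; real loop is PID)
err = q_des - q_meas;                       % joint error from potentiometer feedback
u   = max(min(Kp * err, 10), -10);          % signed command, clipped to +/-10 V (assumed range)

if u >= 0
    v_extend  = u;  v_retract = 0;          % positive command drives one directional valve
else
    v_extend  = 0;  v_retract = -u;         % negative command drives the opposing valve
end
% v_extend / v_retract would then be written to the joint's two NI 9264 analogue outputs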
The left-hand manipulator has been fitted with a RS890K 500 W Scorpion Powered Hand Saw (Black and Decker, Slough, UK), as shown in Figure 4. This tool is mounted over the top of the gripper using a metal plate, which precludes use of the left-hand gripper for manipulation tasks. Rotation of this wrist joint is prevented via the use of a further aluminium shank across the joint. The saw provides a stroke length of 23 mm and a speed of 2700 strokes per minute, and is equipped with a variety of blades: a wood blade attachment was used for the cardboard pipe, and an 18 teeth-per-inch metal blade for the plastic and metal pipes.
5. Results and Discussion
Initially, both the position calculated with the kinematic techniques and the physically measured position of the gripping end effector were compared to that requested by the user in the controlling LabVIEW software. Here a location of (0.7, −0.2, 0.35) was requested ten times, so that the repeatability of the system could be determined. The iterative inverse Jacobian kinematic algorithm determined the joint angles, which, when fed back through the corresponding forward kinematic algorithm, yielded calculated positions within a few centimetres of the desired position over the ten runs of this experiment. Using a simple Pythagorean calculation, the average distance between the desired position and those indicated by the forward kinematic calculations was 2.73 cm over five sets of readings. Viewed another way, the average predicted position taken up by the end effector was (0.704, −0.185, 0.340), a distance of 1.82 cm from the desired position. The physically measured position of the end effector was also determined using a tape measure and, over the same five readings, was found to be at an average position of (0.725, −0.237, 0.330). This represents a distance of 4.89 cm between the actual position of the end effector and the value requested by the user initially. Hence, it could be said that, in this particular case, 1.82 cm of this deviation is due to the inaccuracy of the kinematic technique used, with the remaining 3.07 cm due to the mechanical inaccuracy of the Hydrolek arms and associated peripherals.
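These figures can be reproduced directly from the quoted average coordinates, for example:

% Reproducing the reported position errors from the coordinates quoted above
% (all positions in metres, distances converted to centimetres).
desired   = [0.700, -0.200, 0.350];    % position requested by the operator
predicted = [0.704, -0.185, 0.340];    % mean position from the forward kinematics
measured  = [0.725, -0.237, 0.330];    % mean tape-measured end-effector position

kin_err  = 100 * norm(predicted - desired)   % ~1.8 cm; the small difference from the quoted
                                             % 1.82 cm reflects rounding of the averaged coordinates
mech_err = 100 * norm(measured  - desired)   % ~4.89 cm overall positional error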
Cutting operations were successfully performed on pipes of each of the three chosen materials, demonstrating successful operation of the assisted grasp and cut system. As a safety feature, the MATLAB algorithm requires a ‘go/no-go’ confirmation from the operator once the cutting tool is in place and ready to begin operations. This allows cancelling of the operation if the operator feels that collision is likely, although clearly future implementations of the system would potentially benefit from some level of prior warning of mathematically determined collisions once cutting has begun.
Figure 5 shows the Hydrolek arms in the cut and grasp positions, grasping at the top of the pipe and ready to cut near the bottom. A further limitation observed was that the pipe had to be located between roughly 0.75 m and 1.5 m from the base, otherwise the physical limitations of the arms rendered the cutting operation impossible. Indeed, as this distance approached these limits, oscillations within the inverse kinematic calculations meant that gripping and cutting were compromised.
Figure 6 shows plots of the actual and required joint angles of each individual joint as the actuation is performed. One marked advantage of this method of control over standard tele-operation is the ability of the controller to move all the joints at once, as opposed to a human operator, who will be limited in their ability to multitask between joint controls. This has not only the advantage of being quicker, but is also effective in reducing operator workload. The reduced time and energy required is a desirable trait for remote systems, and indeed across many applications of hydraulic manipulators [7].
Haptic feedback was not implemented on the gripper here, and thus the point at which the pipe is deemed to be successfully grasped is confirmed through visual observation by the operator. The use of such feedback has recently been explored via force sensors on a manipulator to allow careful grasping of potentially delicate objects [8,40] and could be implemented here in further research. This would be particularly beneficial in a remotely operated environment, where the operator is not in the same room and perhaps has a poor angle of view on the object.
Figure 7 shows the cut surface of the three pipes, with the single joint actuation enabling a relatively smooth cut in the desired location. Whilst the quality of the cut is somewhat rough compared with the standard that can be achieved by dedicated robotic manufacturing equipment, the desired result of a completely cut pipe is achieved. This allows the system to be used for equipment removal tasks where pipe or structural severance is required but the smoothness of the cut edge is not important. This rudimentary method of control is not optimised for cutting speed, although all the pipes were cut within a reasonable timeframe, as noted in Table 3.
Figure 8 shows the progress of the saw through each of the pipes, as represented by the angle of joint four. As can be seen, in none of the cases is the cut entirely smooth in terms of saw progress. This is largely down to the method of control, i.e., using short pulses to allow for slow overall movement. In the bottom right of Figure 8, a section of the aluminium cut is plotted alongside the voltage signal provided to the solenoid valve actuating the joint. This allows the cutting action to be understood, i.e., a pulsed signal produces a small movement in the saw followed by a recovery time. It can also be seen in Figure 8 that the rate of cutting varies across the duration of the movement, despite being actuated by the same periodic signal. This is a function of both the material being cut and the angle of joint four at any moment. It is known that the hydraulic system contains dead zones and that the movement produced by a given signal is not constant across the full sweep of the joint. At this pulse voltage and duration, the movement speed was sufficient for cutting without risking damage to the blade. However, the saw is clearly capable of cutting at a much higher rate, as shown in Table 3, particularly in the case of the cardboard and plastic pipes.
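The pulsed command described here can be illustrated with a short sketch; the amplitude, pulse width and period below are illustrative assumptions rather than the values used in the experiments of Figure 8.

% Illustrative pulse train for slow saw advance: a short voltage pulse to the
% joint-four valve followed by a recovery period (all parameter values assumed).
dt        = 0.01;                     % controller time step (s)
t         = 0:dt:20;                  % 20 s excerpt of the cut
period    = 2.0;                      % one pulse every 2 s (assumed)
width     = 0.3;                      % 0.3 s pulse width (assumed)
amplitude = 4.0;                      % valve drive voltage during the pulse (assumed)

v = amplitude * (mod(t, period) < width);   % pulsed solenoid valve command
plot(t, v); xlabel('Time (s)'); ylabel('Valve voltage (V)');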
In order to provide a benchmark cutting time for comparison, a number of cuts were made by human operators using the same saw and blades as used by the robot arms. In Table 3, the cutting times for each pipe material are noted for manual cutting, tele-operational control and semi-autonomous control. In the tele-operational and semi-autonomous scenarios, the material-independent time taken for the cutting platform to reach its cutting position must also be considered. In the tele-operated case, after a period of practice, the non-skilled operator could reach a series of desired cutting positions within an average of 32.2 s (standard deviation 5.4 s). This compares with the autonomous method of moving the end effector into the required position, which took an average of 0.9 s (standard deviation 0.04 s). Hence, overall, the semi-autonomous approach is observed to be faster than tele-operation for these experiments.
Human operation is clearly far quicker in performing a cut, particularly in the cardboard and ABS plastic cases. The human operator cuts with the benefit of natural force feedback from holding the saw, alongside a level of acoustic feedback notifying when the saw is reaching its physical limits, allowing a much fuller use of the saw's capacity for cutting. This demonstrates a clear potential for improved cutting speeds with the manipulator system, in the form of further research into a feedback mechanism for saw movement control. At present this subject has not been greatly explored in the literature, particularly in the case of hydraulic manipulators, as fine control is more usually performed by electric motor driven systems. Some research has been performed into the use of an acoustic signature in the case of robot arm mounted grinding tools for precision tooling in the aerospace industry [10]. Force feedback via the use of force sensors has also found application in cases where very precise control of an end effector is required, such as in robotic surgery [41]. In the present application, however, pure force feedback may prove less effective, due to the unpredictability of the range of materials that may require cutting. Nonetheless, it would still be advantageous to have a sense of how close the applied force is to the limit of the saw. Further feedback signals which could be used for saw feedback control include sawblade speed and temperature.
6. Conclusions
Within this work, pipes of three different materials were cut semi-autonomously with dual hydraulic manipulators, which are inherently radiation tolerant by design. COTS 3D vision hardware and bespoke data processing capability have further been utilized to enable this operation via a grasp and cut system. This experimental work shows that the system can aid an operator in performing decommissioning tasks whilst staying remote from the site of work. Whilst the actions performed in this operation could also be controlled directly via standard tele-operational methods, the semi-autonomous approach allows for reduced operator workload, potentially allowing quicker performance of decommissioning tasks, reducing the requirement for skilled operators and improving the energy efficiency of the hydraulic manipulators through reduced actuation. This experimental work further corroborates earlier research testing of the efficacy of vision based grasp and cut assistance methods [32], showing that such systems can provide an efficient method of manipulator control and pose determination, and enable practical cutting operations.
Of the three methods investigated, purely manual cutting of pipework is the quickest, as would be expected. However, manually cutting pipework whilst in a radiologically active area may not represent an ALARP scenario and thus may not be realistically feasible. An alternative method of control, while still using the physical robot platform, is tele-operation, i.e., direct control of either the joints of the manipulators or the position and orientation of the end effectors in 3D space. This was shown to be extremely difficult, as the operator was not allowed sight of the scene directly, instead viewing it only via a 2D camera. However, after a period of practice, the operations were possible, with the time taken to move each arm into the required position found to be 32.2 s on average. Cutting of the pipes themselves was also found to be possible this way, with times taken to cut the cardboard, ABS plastic and aluminium pipes of 12.2, 16.6 and 67.2 s respectively. However, the semi-autonomous cutting system was found to be a more effective solution overall, since the average time taken to reach the correct cutting position was only 0.9 s (with simultaneous, rather than consecutive, joint actuations). The open loop cutting system did not allow for any kind of feedback and thus the cutting operations were slightly slower than the tele-operational cutting, as the cutting speed was unaffected by any outside stimuli such as audio feedback from the saw.
It is clear from the results acquired that there is scope for improved control of the cutting and an increase in speed. This may be through a more sophisticated open loop technique that allows finer control despite the dead-zone properties of the hydraulic actuators, or through methods which utilise closed loop speed control with feedback from the saw itself, such as acoustic or blade speed signatures. Whilst the current cutting approach is relatively slow, the ability to cut pipes without placing personnel into radiologically active environments allows for safer performance of decommissioning tasks. The hydraulic manipulator system has been specifically chosen for its radiological hardness and, as a result, the number of electronic components in the system is greatly reduced compared to electrically actuated platforms. Whilst further validation of the Kinect system (and a possible upgrade to a radiation hardened camera) would be required before deployment in highly active environments, the proposed system could already replace human workers in more moderately active decommissioning areas, where human presence is restricted due to adherence to ALARP principles and yearly dose limits. Whilst developed with nuclear decommissioning in mind, such a system could also find use in a range of hazardous environments other than radiological, such as earthquake-damaged buildings and infrastructure [42]. The demonstration of the utility of the vision assistance system also encourages the expansion of the use of this technology in operator assistance and semi-autonomy applications, such as the use of more cameras to aid the operator and improve spatial awareness. The fact that it can aid an operator in controlling multiple arms simultaneously also opens up a wider range of possible tasks, such as assisted grabbing of objects with both manipulators, lifting or dragging of heavier objects, and the use of a wider range of manipulator-mounted tools.