GB2589419A - Fabric maintenance sensor system
- Publication number
- GB2589419A (application GB2012375.8A / GB202012375A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- sensor
- algorithm
- sensor system
- data
- work
- Prior art date
- Legal status
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B05—SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
- B05B—SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
- B05B13/00—Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
- B05B13/005—Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00 mounted on vehicles or designed to apply a liquid on a very large surface, e.g. on the road, on the surface of large containers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B05—SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
- B05B—SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
- B05B13/00—Machines or plants for applying liquids or other fluent materials to surfaces of objects or other work by spraying, not covered by groups B05B1/00 - B05B11/00
- B05B13/02—Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work
- B05B13/04—Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation
- B05B13/0431—Means for supporting work; Arrangement or mounting of spray heads; Adaptation or arrangement of means for feeding work the spray heads being moved during spraying operation with spray heads moved by robots or articulated arms, e.g. for applying liquid or other fluent material to 3D-surfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B05—SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
- B05B—SPRAYING APPARATUS; ATOMISING APPARATUS; NOZZLES
- B05B9/00—Spraying apparatus for discharge of liquids or other fluent material, without essentially mixing with gas or vapour
- B05B9/03—Spraying apparatus for discharge of liquids or other fluent material, without essentially mixing with gas or vapour characterised by means for supplying liquid or other fluent material
- B05B9/04—Spraying apparatus for discharge of liquids or other fluent material, without essentially mixing with gas or vapour characterised by means for supplying liquid or other fluent material with pressurised or compressible container; with pump
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B24—GRINDING; POLISHING
- B24C—ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
- B24C3/00—Abrasive blasting machines or devices; Plants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/005—Manipulators for mechanical processing tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0075—Manipulators for painting or coating
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
- B25J13/089—Determining the position of the robot with reference to its environment
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/022—Optical sensing devices using lasers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/005—Manipulators mounted on wheels or on carriages mounted on endless tracks or belts
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/02—Manipulators mounted on wheels or on carriages travelling along a guideway
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B24—GRINDING; POLISHING
- B24C—ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
- B24C1/00—Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods
- B24C1/003—Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods using material which dissolves or changes phase after the treatment, e.g. ice, CO2
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45066—Inspection robot
Abstract
A sensor system 150 for a robotic apparatus (11, Fig. 1B) for industrial fabric maintenance of a work object in a work location comprises a first optical system 152 for collecting a first data set relating to a work location and/or work object, a second optical system 154 for collecting a second data set relating to a work location and/or work object, and at least one processing module for processing the first and second data sets. The first optical system 152 comprises an optical camera, and the first data set comprises camera imaging data. The second optical system 154 comprises a laser positioning system, and the second data set comprises laser positioning data. The sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy, and to process the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy. The second resolution or accuracy is higher than the first resolution or accuracy. The system may be used for inspection, surface preparation or coating.
Description
FABRIC MAINTENANCE SENSOR SYSTEM

The present invention relates to inspection, surface preparation and coating of objects and structures in an industrial environment. Aspects of the invention relate to a sensor system for industrial fabric maintenance of work objects and structures in a work location. The invention has particular application to the blasting and painting of steel surfaces.
Background to the invention
Large industrial complexes including oil and gas platforms, petrochemical plants and refineries contain significant areas of steel surfaces which are often exposed to corrosive environments and so must be regularly protected against the elements.
The inspection, surface preparation and coating of these structures (known as "fabric maintenance") requires an array of human labour, handheld tools, machines and safety and power equipment.
Approaches to fabric maintenance vary but rely on manual or remote-controlled methods of inspection, surface preparation and coating. With a reliance on human labour, asset owners often accumulate a backlog of required fabric maintenance which can equate to many years of required effort.
An important part of surface preparation is ensuring that the prepared surface meets the specification required by the end user. This typically includes visual inspection to verify that any previous coating/rust has been removed and that the surface does not have excessive dust. It is also important that salt levels are sufficiently low and that the surface roughness is sufficient to hold the coating.
Maintenance operations on structures and objects in an industrial environment are complex due to congested work areas. There are safety issues with conducting surface preparation such as sand blasting, and with transporting equipment without colliding with infrastructure or other equipment.
Summary of the invention
It is an object of an aspect of the present invention to obviate or at least mitigate the foregoing disadvantages of fabric maintenance prior art.
It is another object of an aspect of the present invention to provide a sensor system for industrial fabric maintenance of a work object in a work location in a safe and effective manner.
It is a further object of an aspect of the present invention to provide a sensor system capable of conducting industrial fabric maintenance of a work object reliably, autonomously and to a high standard.
Further aims of the invention will become apparent from the following description.
According to a first aspect of the invention there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location, the sensor system comprising:
a first optical system for collecting a first data set relating to a work location and/or work object;
a second optical system for collecting a second data set relating to a work location and/or work object; and
at least one processing module for processing the first and second data sets;
wherein the first optical system comprises an optical camera, and the first data set comprises camera imaging data;
wherein the second optical system comprises a laser positioning system, and the second data set comprises laser positioning data;
wherein the sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy, and wherein the sensor system is operable to process the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
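By way of illustration only, the two-stage localisation of this aspect might be sketched as follows. This is a minimal sketch assuming a planar pose; the class and function names (Pose, locate_coarse, locate_fine) and the example accuracies are invented for the illustration and are not taken from the disclosure.

```python
# Illustrative sketch only: coarse camera-based localisation seeding a finer
# laser-based refinement. All names and numbers are assumptions.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float            # metres, in the work-location frame
    y: float
    theta: float        # heading in radians
    uncertainty: float  # 1-sigma position uncertainty, metres

def locate_coarse(camera_image) -> Pose:
    """First data set: camera imaging data gives a coarse pose estimate,
    e.g. by matching visual features against a model of the work object."""
    # ... feature matching / photogrammetry would go here ...
    return Pose(x=2.40, y=1.10, theta=0.30, uncertainty=0.05)   # ~5 cm

def locate_fine(laser_scan, initial: Pose) -> Pose:
    """Second data set: laser positioning data refines the coarse estimate,
    e.g. by aligning the scan to the work object around the initial pose."""
    # ... scan alignment seeded with `initial` would go here ...
    return Pose(x=2.412, y=1.093, theta=0.302, uncertainty=0.002)  # ~2 mm

def localise(camera_image, laser_scan) -> Pose:
    coarse = locate_coarse(camera_image)     # first resolution or accuracy
    fine = locate_fine(laser_scan, coarse)   # second, higher resolution
    return fine
```

In this sketch the camera data produces a pose only good enough to seed the laser-based refinement, which is the division of labour between the first and second resolution or accuracy described above.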
The industrial fabric maintenance of a work object in a work location may include inspection operations, surface preparation operations and/or coating operations.
The robotic apparatus may comprise a robotic manipulator assembly. The robotic manipulator assembly may comprise at least one functional module at an operative end of the robotic manipulator assembly.
The sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, robotic manipulator assembly and/or functional module. The sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during, before or after a fabric maintenance operation.
The sensor system may be configured to monitor the movement of the robotic apparatus, robotic manipulator assembly and/or functional module. The sensor system may be configured to monitor translational movement of the robotic apparatus, robotic manipulator assembly and/or functional module.
The sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, robotic manipulator assembly and/or functional module in relation to the work location and/or work object in accordance with a plan. The sensor system may be configured to monitor the movement of the robotic apparatus, robotic manipulator assembly and/or functional module, and the at least one processing module may compare the sensor data set with a plan. The at least one processing module may be configured to compare the first data set and/or the second data set with a plan.
The at least one processing module may use an existing plan, or may generate a plan, to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during a fabric maintenance operation.
The plan may comprise a movement sequence for the robotic manipulator assembly or functional module. The movement sequence may define the movement path for the functional module. The plan may comprise at least a head movement path plan for the functional module in relation to the work object.
The apparatus may be configured to present the movement path to an operator, for example as a visual representation of the movement path in a virtual representation of the work location.
The functional module may be selected from the group consisting of an inspection system, a surface treatment system or a coating system.
The surface treatment system may be selected from the group consisting of a water jetting system, a dry ice blasting system and an abrasive blasting system such as a grit blasting system. The dry ice blasting, grit blasting, water or steam jetting systems may either be closed or open loop. The closed loop system may contain two or more hoses, with at least one hose configured to project the dry ice, grit, steam or water medium onto the required surface and at least one hose acting as a return line for the residual dry ice, grit, steam or water medium and treated-surface waste such as rust.
The coating device may be a spray paint device. The coating may be a paint. The paint may be a clear paint, a mica paint, a metallic paint, a water-based paint, a solvent-based paint and/or a multi-component paint.
The sensor system may comprise at least one sensor. The at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor. The sensor system may comprise two or more sensors.
The sensor system data may be processed to localise the robotic manipulator assembly and/or functional module with respect to the work location and/or work object. The sensor data may be processed using algorithms to recognise a workpiece and estimate its most likely position. A model or a scan of the work location and/or work object may be used to estimate where the robotic manipulator assembly and/or functional module is in relation to the work location and/or work object. This may be a probabilistic estimate which may be updated using information gathered from the sensor system.
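A probabilistic estimate updated with sensor information may, for example, take the form of a recursive Bayesian update. The scalar Kalman-style sketch below is an assumed illustration (function name and numbers invented for the example), not the method actually claimed.

```python
# Illustrative sketch only: fusing a prior position estimate with a new sensor
# measurement so that the estimate's uncertainty shrinks with each update.
def update_estimate(mean: float, var: float,
                    measurement: float, meas_var: float) -> tuple[float, float]:
    """Fuse a prior estimate (mean, var) of the manipulator's position along
    one axis with a new sensor measurement; returns the updated (mean, var)."""
    gain = var / (var + meas_var)        # weight given to the measurement
    new_mean = mean + gain * (measurement - mean)
    new_var = (1.0 - gain) * var         # uncertainty always decreases
    return new_mean, new_var

# Example: prior from the model/scan, refined by a laser range reading.
mean, var = 2.40, 0.05**2                # prior: 2.40 m, 5 cm std-dev
mean, var = update_estimate(mean, var, measurement=2.413, meas_var=0.002**2)
print(f"position ~ {mean:.3f} m, std-dev ~ {var**0.5:.4f} m")
```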
The at least one processing module may use sensor data to create a probabilistic map of free and occupied volumes. This map may be updated in real time and may be used in a simulated environment for path planning to ensure that there are no collisions.
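A probabilistic map of free and occupied volumes is commonly realised as an occupancy grid. The sketch below is a minimal, assumed illustration of such a map and its use for collision checking during path planning; the cell size, log-odds weights and collision threshold are invented for the example.

```python
# Illustrative sketch only: a minimal 2D occupancy grid of free and occupied
# cells, updated from sensor returns and queried before executing a path.
import math

class OccupancyGrid:
    def __init__(self, size: int = 100, cell: float = 0.05):
        self.cell = cell                                     # 5 cm cells
        self.logodds = [[0.0] * size for _ in range(size)]   # 0.0 = unknown

    def _index(self, x: float, y: float) -> tuple[int, int]:
        return int(x / self.cell), int(y / self.cell)

    def mark_occupied(self, x: float, y: float, weight: float = 0.9):
        i, j = self._index(x, y)
        self.logodds[i][j] += weight     # evidence of an obstacle

    def mark_free(self, x: float, y: float, weight: float = 0.4):
        i, j = self._index(x, y)
        self.logodds[i][j] -= weight     # evidence the cell is free

    def p_occupied(self, x: float, y: float) -> float:
        i, j = self._index(x, y)
        return 1.0 / (1.0 + math.exp(-self.logodds[i][j]))

    def collides(self, path, threshold: float = 0.5) -> bool:
        """Reject a planned path if any waypoint crosses a likely obstacle."""
        return any(self.p_occupied(x, y) > threshold for x, y in path)

grid = OccupancyGrid()
grid.mark_occupied(1.00, 0.50)                   # a sensor return hit here
print(grid.collides([(0.2, 0.2), (1.0, 0.5)]))   # True: path crosses obstacle
```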
The sensor system may collect data relating to the performance of the operation over the area. The sensor system may identify parts of the work area that require a further operation. The at least one processing module may generate a second plan for carrying out a further fabric maintenance operation on the work area. For example, the sensor system may inspect the quality of the operation and identify areas that require a repeat operation.
The work object may be located in a booth. The fabric maintenance operation may be conducted in a paint and/or blast booth. The booth may be a sealed area that allows inspection, painting or blasting. In the case of blasting, the booth may allow open blasting (grit fired at objects to remove rust and other contaminants). The booth may have a grit recovery system that allows the grit to be reused. The booth may comprise a standard 20ft or 40ft container.
The sensor system may use an existing plan or generate a plan. The sensor system may compare the operation of the robotic apparatus, robotic manipulator assembly and/or functional module with the plan. The sensor system may compare at least one factor with the plan. The at least one factor may include location, orientation, position and/or velocity of the robotic apparatus, robotic manipulator assembly and/or functional module. The at least one factor may include spray rate, surface condition, distance of functional module to object and/or angle of functional module to the work object.
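Comparing monitored factors against a plan might be sketched as below; the factor names, planned values and tolerances are assumptions made for the illustration only.

```python
# Illustrative sketch only: flag the monitored factors that have drifted
# outside the planned tolerance, so the plan can be corrected or adjusted.
PLAN = {
    "standoff_m": 0.30,      # planned distance of functional module to object
    "angle_deg": 90.0,       # planned angle of functional module to surface
    "spray_rate_lpm": 1.2,   # planned spray rate, litres per minute
}
TOLERANCE = {"standoff_m": 0.05, "angle_deg": 10.0, "spray_rate_lpm": 0.2}

def deviations(measured: dict) -> dict:
    """Return (measured, planned) for every factor outside its tolerance."""
    return {
        key: (measured[key], PLAN[key])
        for key in PLAN
        if abs(measured[key] - PLAN[key]) > TOLERANCE[key]
    }

print(deviations({"standoff_m": 0.42, "angle_deg": 88.0, "spray_rate_lpm": 1.25}))
# -> {'standoff_m': (0.42, 0.3)}: the head has drifted too far from the surface
```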
The sensor system may adjust or change the plan. The plan may be adjusted to the operator's specifications. The sensor system may adjust or change the plan in response to sensor data. The sensor system may adjust or change the plan to adapt to changes in the environment where the work is to be performed and/or to avoid obstacles. The sensor system may present the revised plan to an operator for approval. The plan may be adjusted in real time or repeated after the first plan has been executed.
The apparatus may be configured to undertake fabric maintenance tasks remotely and/or autonomously. The sensor system may be configured to enable real time feedback to assess the operation of the functional module and/or the quality of the fabric maintenance.
The sensor system may be configured to analyse data from the first data set, second data set and/or from the at least one sensor to assess the operation of the functional module and/or the quality of the fabric maintenance.
The sensor system may be configured to avoid obstacles and collisions with the external environment and/or personnel. The sensor system may be configured to analyse data from the first data set, second data set and/or from the at least one sensor to avoid obstacles and collisions.
The sensor system, or at least a part thereof, may be mounted on or connected to the robotic apparatus, robotic manipulator assembly and/or functional module. The sensor system, or at least a part thereof, may be mounted or connected to a functional module, which may be removably connected to the robotic apparatus, such that the sensor system or part thereof is removably connected to the robotic apparatus with the functional module. The sensor system, or at least a part thereof, may be mounted on or connected to a portable and/or handheld unit.
According to a second aspect of the invention there is provided a method of operating a robotic device for industrial fabric maintenance of a work object in a work location, the method comprising:
providing a robotic apparatus with a sensor system, the sensor system comprising: a first optical system; a second optical system; and at least one processing module for processing first and second data sets;
collecting a first data set relating to a work location and/or work object using the first optical system;
collecting a second data set relating to a work location and/or work object using the second optical system;
processing the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy; and
processing the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
The term optical system may mean a collection of optical sensors or a single optical sensor. The first optical system may comprise an optical camera. The method may comprise collecting camera imaging data as a first data set. The second optical system may comprise a laser positioning system. The method may comprise collecting laser positioning data as a second data set.
The method may comprise locating, orientating, controlling and/or positioning the robotic apparatus, or at least a part thereof, in response to the first and/or second data set. The method may comprise continually collecting and processing data from the first and/or second optical system during, before and/or after a fabric maintenance operation.
The method may comprise monitoring the movement of the robotic apparatus during, before or after a fabric maintenance operation. The method may comprise comparing the location and/or movement of the robotic apparatus, or at least a part thereof, with a plan. The plan may be an existing plan. The method may comprise generating a new plan.
The method may comprise correcting the movement, location, orientation and/or position of the robotic apparatus, or at least a part thereof, to bring it into conformity with the plan.
The plan may comprise a movement sequence for the robotic manipulator assembly or functional module. The movement sequence may define the movement path for the functional module. The plan may comprise at least a head movement path plan for the functional module in relation to the work object.
Embodiments of the second aspect of the invention may include one or more features of the first aspect of the invention or its embodiments, or vice versa.
According to a third aspect of the invention there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location, the sensor system comprising:
at least one optical system for collecting a data set relating to a work location and/or work object; and
at least one processing module for processing the data set;
wherein the sensor system is operable to process the data set to locate and/or orientate the robotic apparatus in relation to the work location and/or work object.
The sensor system may comprise two or more optical systems. The at least one processing module may be configured to process a data set from each of the optical systems.
The at least one processing module may process the collected data using an algorithm. The at least one processing module may process the collected data using at least two algorithms. The at least one processing module may process the collected data using at least two parts of one algorithm. A first algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. A second algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy.
A second algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to augment the resolution or accuracy of the first algorithm by combining the information from the second algorithm with that from the first algorithm. The algorithms may be combined together such that they are sub-algorithms in a larger algorithm.
The at least two algorithms may be considered to be at least two separate algorithms or at least two parts of the same algorithm. An algorithm is considered to be made up of at least one component, where a core part of the algorithm may be used to complete an intended purpose of the algorithm. However, aspects of the core part of the algorithm may be augmented to improve the performance of the algorithm. The core part of the algorithm may be augmented by a second component of the same algorithm or by another algorithm.
An algorithm may comprise two (or more) components. Where one of those components could be used independently, without the second component, to perform a task such as estimating the position of a workpiece, the algorithm is considered to be two algorithms.
An algorithm may have multiple components that work together to provide a single accuracy; each component may provide independent information that is complementary to the other components. For example, a core part of an algorithm may process small movements between each data frame that is captured (say at 30 frames per second), and another part of the algorithm may independently identify when the apparatus intersects with a previous position and closes a loop. Where a loop closure is detected, this may be used to improve the accuracy of the core part of the algorithm.
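The interplay between an incremental (frame-to-frame) component and a loop-closure component can be illustrated with a deliberately simplified one-dimensional sketch; the drift model and the uniform redistribution of the loop error are assumptions made for the example.

```python
# Illustrative sketch only: frame-to-frame motion estimates accumulate drift,
# which a detected loop closure is used to correct along the trajectory.
def integrate_frames(deltas):
    """Core component: sum small frame-to-frame movements (e.g. at 30 fps)
    into a trajectory. Each small estimate carries error, so drift builds up."""
    poses, x = [0.0], 0.0
    for d in deltas:
        x += d
        poses.append(x)
    return poses

def close_loop(poses, loop_error):
    """Second component: the apparatus is detected back at a known position;
    distribute the accumulated error evenly back along the trajectory."""
    n = len(poses) - 1
    return [p - loop_error * (i / n) for i, p in enumerate(poses)]

# Apparatus moves 5 m out and 5 m back; each measured step is biased by 1 mm.
true_steps = [0.1] * 50 + [-0.1] * 50
measured = [d + 0.001 for d in true_steps]
poses = integrate_frames(measured)
print(f"drift before closure: {poses[-1]:.4f} m")   # ~0.1 m, should be 0.0
poses = close_loop(poses, loop_error=poses[-1])
print(f"after closure: {poses[-1]:.4f} m")          # 0.0
```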
Multiple parts of one algorithm may be considered to be separate algorithms. For example, a first part of an algorithm which identifies a loop closure may be used independently of a second part of the algorithm which may be used to estimate the position of the workpiece. The two parts of the same algorithm may be used independently, albeit with lower accuracy, and may be considered as two different algorithms.
The at least one processing module may be configured to process a first data set from a first optical system to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. The at least one processing module may be configured to process a second data set from a second optical system to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy. The second resolution or accuracy may be higher than the first resolution or accuracy.
The optical system may comprise a camera and/or a laser system. The at least one processing module may process data from the camera and/or a laser system.
The sensor system, or at least a part thereof, may be mounted to or located on the robotic apparatus. The sensor system, or at least a part thereof, may be mounted to or located on a robotic manipulator assembly and/or a functional module at an operative end of the robotic manipulator assembly. The sensor system may be configured to locate, orientate and/or position the robotic apparatus, robotic manipulator assembly and/or functional module in relation to the work location and/or work object in accordance with a plan.
The functional module may be selected from the group consisting of an inspection system, a surface treatment system or a coating system.
The sensor system may comprise at least one sensor. The at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
The sensor system may be configured to move, locate, orientate and/or position the robotic apparatus, robotic manipulator assembly and/or functional module according to an existing plan, and/or to generate a plan to move, locate, orientate and/or position the robotic manipulator assembly and/or a functional module relative to the work object in a work location.
The apparatus may be configured to undertake fabric maintenance tasks remotely and/or autonomously. The sensor system may be configured to enable real time feedback to assess the operation of the functional module and/or the quality of the fabric maintenance. The sensor system may be configured to avoid obstacles and collisions with the external environment and personnel.
Embodiments of the third aspect of the invention may include one or more features of the first or second aspects of the invention or their embodiments, or vice versa.

According to a fourth aspect of the invention, there is provided an apparatus for industrial fabric maintenance of a work object in a work location, the apparatus comprising:
a robotic manipulator assembly;
a sensor system operable to collect data relating to the work location and/or work object; and
a functional module at an operative end of the robotic manipulator assembly;
wherein the functional module comprises:
a first inlet and a first outlet for a surface treatment medium;
a first conduit between the first inlet and first outlet; and
a connector for removably connecting the functional module to the robotic manipulator assembly such that the first inlet is coupled to a source of the surface treatment medium;
wherein the functional module comprises a shape or form selected for delivery of the surface treatment medium in dependence on the geometry of a work area on the work object.
The apparatus may be configured to move, orientate, locate and/or position the functional module in response to feedback from at least one sensor in the sensor system. The sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, robotic manipulator assembly and/or functional module in relation to the work location and/or work object in accordance with a plan.
The plan may comprise a robotic movement sequence plan for the robotic manipulator assembly. The robotic movement sequence plan may define the head movement path for the functional module.
The functional module may comprise a surface preparation head, and the surface treatment medium may be a surface preparation medium. The functional module may comprise a surface coating head, and the surface treatment medium may be a surface coating medium. The surface coating medium may comprise a paint.
The shape or form of the functional module may comprise a head surface profile, which may be configured to be presented to the work area of the work object, and which may be configured to engage or otherwise interact with the work area of the work object. The head surface profile may comprise a substantially flat or flat planar surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a substantially flat or flat planar work area surface.
The head surface profile may comprise a concave surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a convex work area surface. The head surface profile may comprise a convex surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a concave work area surface.
The head surface profile may comprise a cylindrical or part-cylindrical surface configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a cylindrical or part-cylindrical work area surface, for example the surface of a pipe. The head surface profile may comprise a surface configured to be presented to the work area of the work object that is curved with respect to two orthogonal axes.
The head surface profile may comprise one or more surface projections configured to be presented to the work area of the work object. Such a head surface profile may be particularly suitable for a recess, groove or relief in a work area surface.
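The selection of a head shape or form in dependence on the work area geometry could be sketched as a simple lookup, following the pairings described above; the geometry labels and the mapping are assumptions made for the illustration.

```python
# Illustrative sketch only: pick a head surface profile to suit the scanned
# work area geometry, per the pairings described in the text above.
PROFILE_FOR_SURFACE = {
    "flat":             "flat planar head",
    "convex":           "concave head",
    "concave":          "convex head",
    "pipe":             "part-cylindrical head",
    "groove_or_recess": "head with surface projections",
}

def select_head(surface_geometry: str) -> str:
    """Return the functional-module head profile matching the geometry."""
    return PROFILE_FOR_SURFACE.get(surface_geometry, "flat planar head")

print(select_head("pipe"))  # -> part-cylindrical head
```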
The apparatus may comprise one or more sensors. The one or more sensors may be mounted on the functional module, such that they are removably connected to the robotic manipulator assembly with the functional module.
Alternatively, the one or more sensors may be mounted on the robotic manipulator assembly, for example such that they remain on the robotic manipulator assembly when the functional module is removed.
Where the apparatus comprises a plurality of sensors, a first subset of the sensors may be mounted on the functional module, and a second subset of sensors may be mounted on the robotic manipulator assembly. The one or more sensors may form a part of a sensor system. The sensor system may comprise a first optical system for collecting a first data set relating to a work location and/or work object and may comprise a second optical system for collecting a second data set relating to a work location and/or work object. The sensor system may comprise at least one processing module for processing the first and second data sets.
The first optical system may comprise an optical camera, and the first data set may comprise camera imaging data. The second optical system may comprise a laser positioning system, and the second data set may comprise laser positioning data.
Preferably, the sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. More preferably, the sensor system is operable to process the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
Embodiments of the fourth aspect of the invention may include one or more features of the first to third aspects of the invention or their embodiments, or vice versa.

According to a fifth aspect of the invention, there is provided a functional module for removable connection to the apparatus of the first or third aspect of the invention.
The apparatus may be configured to orientate, position, locate and/or move the functional module based on sensor data. The apparatus may be configured to orientate, position, locate and/or move the functional module according to a plan. The plan may comprise a robotic movement sequence plan for the robotic manipulator assembly. The robotic movement sequence plan may define the head movement path for the functional module.
Embodiments of the fifth aspect of the invention may include one or more features of the first to fourth aspects of the invention or their embodiments, or vice versa.
According to a sixth aspect of the invention, there is provided a modular system of components comprising the apparatus of the third aspect of the invention and a plurality of functional modules interchangeable on the robotic manipulator assembly of the apparatus, wherein each functional module comprises a shape or form selected for delivery of the surface treatment medium in dependence on the geometry of a work area on the work object.
Embodiments of the sixth aspect of the invention may include one or more features of the first to fifth aspects of the invention or their embodiments, or vice versa.
According to a seventh aspect of the invention, there is provided a method of performing a fabric maintenance operation using the apparatus according to the first or third aspects of the invention or the system according to the fifth aspect of the invention, the method comprising removing a first functional module from the apparatus, and connecting a second functional module to the apparatus.
Embodiments of the seventh aspect of the invention may include one or more features of the first to sixth aspects of the invention or their embodiments, or vice versa.
According to an eighth aspect of the invention there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location, the sensor system comprising:
at least one sensor for collecting a data set relating to a work location and/or work object; and
at least one processing module for processing the data set;
wherein the sensor system is operable to process the data set to locate and/or orientate the robotic apparatus in relation to the work location and/or work object.
The at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor. The sensor system may comprise two or more sensors. The at least one processing module may be configured to process a data set from each of the sensors.
The at least one processing module may be configured to process a first data set from a first sensor to locate and/or orientate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. The at least one processing module may be configured to process a second data set from a second sensor to locate and/or orientate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy. The second resolution or accuracy may be higher than the first resolution or accuracy.
The first sensor may be a camera. The second sensor may be a laser system.
Embodiments of the eighth aspect of the invention may include one or more features of the first to seventh aspects of the invention or their embodiments, or vice versa.
According to a ninth aspect of the invention there is provided a method of operating a robotic device for industrial fabric maintenance of a work object in a work location, the method comprising:
providing a robotic apparatus with a sensor system, the sensor system comprising: at least one sensor; and at least one processing module for processing at least one data set from the at least one sensor;
collecting at least one data set relating to a work location and/or work object using the at least one sensor; and
processing the at least one data set to locate and/or orientate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
The at least one sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
The method may comprise providing two or more sensors. The method may comprise collecting a first data set relating to a work location and/or work object using a first sensor; collecting a second data set relating to a work location and/or work object using a second sensor; processing the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy; and processing the second data set to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy may be higher than the first resolution or accuracy.
Preferably the first sensor is an optical system such as an optical camera. Preferably the second sensor is an optical system such as a laser system.
Embodiments of the ninth aspect of the invention may include one or more features of the first to eighth aspects of the invention or their embodiments, or vice versa.
According to a tenth aspect of the invention there is provided a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location, the sensor system comprising:
a first optical sensor for collecting data relating to a work location and/or work object;
a second optical sensor for collecting data relating to a work location and/or work object; and
one or more processing modules for processing the collected data;
wherein the sensor system is operable to process the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
The sensor system may be operable to process the data to locate a robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
The first optical sensor may be an optical camera for collecting a first data set comprising camera imaging data. The second optical sensor may be an optical camera for collecting a second data set comprising camera imaging data, or the second optical sensor may be a laser positioning system and the second data set comprises laser positioning data.
The industrial fabric maintenance may be an inspection operation, surface preparation operation and/or coating operation.
The sensor system may be configured to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during, before or after a fabric maintenance operation.
The sensor system may comprise at least one further sensor, wherein the at least one further sensor may be selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
9 The sensor system, or at least a part thereof, may be mounted on the robotic apparatus.
The sensor system, or at least a part thereof, may be mounted on a functional module of the robotic apparatus.
The at least one processing module may use an existing plan or generate a plan to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during a fabric maintenance operation.
The at least one processing module may process the collected data using at least two algorithms. A first algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
A second algorithm may process the data to locate the robotic apparatus in relation to the work location and/or work object to provide a second resolution or accuracy. The second algorithm may process the data to augment the resolution or accuracy of the first algorithm by combining information from the second algorithm with that from the first algorithm. The algorithms may be combined together such that they are sub-algorithms in a larger algorithm.
Data from the first optical system, second optical system and/or at least one further sensor may be processed to follow a movement sequence which defines the movement path for a functional module of the robotic apparatus.
The functional module may be selected from the group consisting of an inspection system, a surface treatment system or a coating system. The surface treatment system may be selected from the group consisting of a water jetting system, a dry ice blasting system and an abrasive blasting system. The coating system may be a paint system.
The sensor system may be configured to undertake fabric maintenance tasks remotely and/or autonomously. The sensor system may be configured to assess the operation of the functional module and/or the quality of the fabric maintenance. The sensor system may be configured to avoid obstacles and collisions with the external environment and/or personnel.
Embodiments of the tenth aspect of the invention may include one or more features of the first to ninth aspects of the invention or their embodiments, or vice versa.
According to an eleventh aspect of the invention there is provided a method of operating a robotic apparatus for industrial fabric maintenance of a work object in a work location, the method comprising:
providing a robotic apparatus with a sensor system, the sensor system comprising: a first optical sensor for collecting data relating to a work location and/or work object; a second optical sensor for collecting data relating to a work location and/or work object; and at least one processing module for processing collected data; and
processing the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
The method may comprise processing the data to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than the first resolution or accuracy.
The method may comprise locating, orientating, controlling and/or positioning the robotic apparatus, or at least a part thereof, in response to the processed data.
The method may comprise processing the collected data using at least two algorithms.
The method may comprise processing the data with a first algorithm to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
The method may comprise processing the data with a second algorithm to locate the robotic apparatus in relation to the work location and/or work object to provide a second resolution or accuracy. The method may comprise processing the data with a second algorithm to augment the resolution or accuracy of the first algorithm by combining information from the second algorithm with that from the first algorithm.
The method may comprise processing the collected data using at least two independent parts of one algorithm. The method may comprise processing the data with a first part of one algorithm to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. The method may comprise processing the data with a second part of one algorithm to locate the robotic apparatus in relation to the work location and/or work object to provide a second resolution or accuracy. The method may comprise processing the data with a second part of one algorithm to augment the resolution or accuracy of the first algorithm by combining information from the second part of one algorithm with that from the first part of one algorithm.
The algorithms may be combined together such that they are sub-algorithms in a larger algorithm.
The method may comprise monitoring the movement of the robotic apparatus during, before and/or after a fabric maintenance operation. The method may comprise comparing the location of the robotic apparatus, or at least a part thereof, with a plan.
Embodiments of the eleventh aspect of the invention may include one or more features of the first to tenth aspects of the invention or their embodiments, or vice versa.
Brief description of the drawings
There will now be described, by way of example only, various embodiments of the invention with reference to the drawings, of which:

Figures 1A and 1B are a sketch and a schematic view of a fabric maintenance system for inspection, surface preparation and coating of steel structures according to the invention;

Figure 2 is a schematic view of different elements of a sensor system which may be incorporated into the fabric maintenance system of Figure 1;

Figures 3A and 3B are a sketch and an enlarged schematic view of a manipulator arm of a fabric maintenance system with a sensor system according to the invention;

Figure 4A is a flow diagram showing an example blasting operation for surface preparation according to an embodiment of the invention;

Figure 4B is a flow diagram showing an example paint operation on a prepared surface according to an embodiment of the invention;

Figure 5 shows a fabric maintenance system set up for surface preparation according to an embodiment of the invention;

Figure 6 is a schematic view of a selection of different end effector heads that may be removably attached to the fabric maintenance system of Figure 1A;

Figure 7 is a schematic view of a fabric maintenance system with an end effector head attachment system according to an embodiment of the invention;
Figures 8A and 8B are side and end profile schematic views of a fabric maintenance system mounted on a rail according to the invention;

Figure 9A is a schematic side view of a fabric maintenance system mounted on a rail with a powered mechanism according to the invention;

Figure 9B is a perspective schematic view of a component of the powered mechanism described in Figure 9A; and

Figures 10A and 10B are side and perspective views of a fabric maintenance system mounted on a rail according to the invention.
Detailed description of preferred embodiments
Figures 1A and 1B show a fabric maintenance system 10 for inspection, surface preparation and coating of steel structures according to the invention.
The system 10 comprises a robotic manipulator assembly 11 having a base 12 mounted on endless tracks 14. Each of the endless tracks is mounted on rollers 15. An articulated manipulator arm 16 is pivotally mounted at a first end 16a on the base 12. The manipulator arm 16 has joints 18 which may be individually actuated to provide the arm 16 with at least six degrees of freedom. The manipulator arm 16 carries an end effector head mount 20 which is movably secured at a second end 16b of the manipulator arm.
A variety of end effector heads 22 may be reversibly fixed to the end effector head mount 20 depending on the desired application, including inspection, surface preparation and/or coating operations. A variety of end effector heads 22 may also be provided depending on the geometry of the surface to be treated. This is discussed further in relation to Figure 6 below.
It will be appreciated that the system may be used for a number of different applications including surface preparation, inspection, or coating operations. Surface preparation (e.g. water jetting) may be used for the purpose of preparing an area for non-destructive testing (NDT) and/or to prepare a surface for a coating operation. The NDT work could include ultrasonic testing (such as phased array) or radiography. The manipulator could also conduct the inspection work by use of an inspection head. Following the surface preparation and/or inspection operations, application of a coating may or may not be required.
The different mountable end effector heads 22 enable the system to conduct inspection, surface preparation and coating operations. The inspection operations include quality control checks of the treated surfaces, such as blasted surfaces and painted surfaces. The surface preparation operations include dry ice blasting, grit blasting or water jetting.
The dry ice blasting, grit blasting or water jetting system may either be closed or open loop. In a closed loop system the end effector heads comprise at least a first conduit to deliver or dispense medium (dry ice, grit or water) to the surface to be treated and a second conduit for suction or removal of the waste medium and contaminants from the treated surface.
The closed loop end effector heads may comprise bristles or fibres around their surface engaging periphery which act as a curtain or screen to assist in the controlled delivery of medium (dry ice, grit or water) from the end effector head to the surface to be treated, and the containment and recirculation of the waste medium and contaminants to the second conduit in the end effector head.
The coating operations may include painting and spray painting.
As an example, Figure 1 shows the robotic manipulator assembly 11 set up for quality control for inspection operations; water jetting, dry ice blasting and grit blasting as surface preparation operations; and spray painting as a coating operation.
The system 10 is connected to a dry ice reservoir 32, a grit reservoir 34, a water reservoir 36 and to a paint reservoir 38, via conduits 32a, 34a, 36a and 38a respectively. Although these are shown in Figure 1A as individual conduits, they may be bundled together and housed in a single conduit called an umbilical.
The dry ice reservoir 32, grit reservoir 34, water reservoir 36 and paint reservoir 38 are connected to a compressor 40 via pressure lines 40a to enable the dry ice, grit, water and paint to be dispensed under pressure.
It will be appreciated that the system may not comprise or use a dry ice reservoir, grit reservoir, water reservoir and paint reservoir, but a selection or combination of these depending on the type of fabric maintenance application, work scope, client objectives, type of material to be worked on and the type of environment where the work is to be performed (e.g. inside an oil storage vessel, outdoors in a potentially explosive environment, on a helideck, etc.).
It will be appreciated that the conduits 32a, 34a, 36a and 38a, either individually or as an umbilical when the conduits are bundled together, may be spooled on a reel to assist with conduit handling and management. The conduits or umbilical may be paid out when required and spare conduit or umbilical removed when not.
The conduits 32a, 34a, 36a and 38a, either individually or as an umbilical when the conduits are bundled together, may be connected to a plurality of small rollers to provide low friction on the surface when the conduit or umbilical follows the automated device. The umbilical rollers may be powered such that the umbilical can be moved to avoid collisions.

The robotic manipulator assembly 11 has a sensor system 50. The sensor system may be located on the robotic manipulator assembly 11, manipulator arm 16, end effector head mount 20 and/or the end effector head 22.
It will be appreciated that components of the sensor system may be located on different parts of the robotic manipulator assembly 11, manipulator arm 16, end effector head mount 20 and the end effector head 22.
In this example the sensor system 50 is located on the end effector head 22 and comprises cameras 52 and a laser-based system 54. However, it will be appreciated that the camera 52 may be used without the laser-based system where accurate positioning is not required.
The camera and laser-based sensor system enables the robotic manipulator assembly 11 and the connected end effector head 22 to be precisely positioned relative to the surface of the structure or object being worked on.
The fabric maintenance system 10 autonomously creates a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface. The system may create a path plan based on Cartesian coordinates to control the overall movement of the robotic manipulator assembly, manipulator arm and/or the end effector head during the treatment operation, and a "pick and place" plan of how the blasting sequence should be conducted.
This allows the robotic manipulator assembly to perform surface preparation and painting operations in congested areas safely.

A handheld controller 60 may be connected to the robotic manipulator assembly 11 to control the operation and movement of the robotic manipulator assembly 11.
In this example, the handheld controller 60 includes a position tracking system 62 which may include a camera 64, rangefinder 66 and/or a lidar based system 68. The position tracking system 62 enables the user to identify and track the position of objects in a 3D environment such that instructions for the robotic manipulator assembly 11 can be created.
The tracked movements may be recorded as a series of stored instructions.
Creation of a work scope for the robot assembly may comprise some or all of the following steps (a control-flow sketch follows the list):

a. use of an existing 3D model or scan, or generation of a 3D model;
b. the user selecting on the 3D model the area to be worked on;
c. running the path generation algorithm;
d. simulation of the process, such as to verify the path and define the speed (e.g. paint thickness modelling using the paint spray cone);
e. presentation of the planned path to the user for approval;
f. translation of the planned path into the robot coordinate frame based on robot localisation with respect to the work piece;
g. updating of the path during the process to ensure the correct distance and angle to the workpiece is maintained, and/or re-planning to loop back over areas (e.g. during blasting), and/or changing of the velocity (e.g. if blasting time needs to be longer than expected); and
h. appending information to the location in the generated 3D model (e.g. visual confirmation of successful blasting).
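The steps above can be read as a simple pipeline. The following is a minimal, hedged sketch of that control flow only; every function is a trivial hypothetical stub (none of these names come from the patent), and the real behaviour of each step is described in sections A to H below.

```python
# Illustrative control-flow sketch of steps (a)-(h); all functions are
# hypothetical stubs, kept trivial so the example runs as written.

def select_area(model):            return model["surfaces"][:1]          # (b)
def generate_path(area):           return [(0, 0, 0), (1, 0, 0)]         # (c)
def simulate(path):                return {"collisions": 0}              # (d)
def approve(path, report):         return report["collisions"] == 0     # (e)
def to_robot_frame(path, robot_T): return list(path)                     # (f)

def create_work_scope(model):
    area = select_area(model)
    path = generate_path(area)
    report = simulate(path)
    if not approve(path, report):
        return None
    return to_robot_frame(path, robot_T=None)  # (g), (h) then run on-line

print(create_work_scope({"surfaces": [["panel-1"]]}))
```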
A. Generation of 3D model

Where a 3D model is not available, it may need to be generated. This can be done using either laser-based systems (lidar) or camera-based systems (e.g. photogrammetry). A handheld controller (also known as a handheld scanner) is typically used for the scanning. The handheld controller includes a position tracking system which may include a camera, rangefinder and/or a lidar based system. The position of the handheld controller will be calculated using a simultaneous localisation and mapping system. The system works by estimating the movement of the handheld controller between successive sensor measurements.
The handheld controller tracks key points (in 2D or 3D) that have a degree of rotational and translational invariance (approximately 20 degrees rotation and 30 cm translation).
These points are used to estimate the motion of the handheld scanner between sensor readings. This estimate is augmented with inertial measurement unit data of the rotational and linear accelerations to provide an estimate with less uncertainty. If the estimated position of the current sensor measurements is sufficiently different from the last recorded frame, a new frame will be inserted into a map.
The frame will have the six degree of freedom pose and will be linked to the previously recorded frame along with the uncertainty of the measurement. In addition to this, a global system of identifying whether a loop has been made will operate. This will aim to find correspondences with frames recorded in other parts of the map which might (based on probability distribution) be part of a loop. When a loop is detected, all of the poses in the map are updated to reflect the new information that a loop has been closed. The poses will then be used to fuse the point clouds captured from individual positions together.
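The "sufficiently different" test for inserting a new frame can be sketched as a simple pose-difference check. This is a hedged illustration only; the thresholds below echo the invariance figures quoted above but are otherwise assumptions, as is the function name.

```python
import numpy as np

TRANS_THRESHOLD_M = 0.30   # ~30 cm translation, per the invariance noted above
ROT_THRESHOLD_DEG = 20.0   # ~20 degrees rotation

def should_insert_keyframe(T_last: np.ndarray, T_now: np.ndarray) -> bool:
    """T_last, T_now: 4x4 homogeneous poses of the handheld scanner."""
    delta = np.linalg.inv(T_last) @ T_now
    translation = np.linalg.norm(delta[:3, 3])
    # Rotation angle recovered from the trace of the 3x3 rotation block.
    cos_angle = np.clip((np.trace(delta[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    return translation > TRANS_THRESHOLD_M or angle_deg > ROT_THRESHOLD_DEG
```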
B. User area selection

Prior to user selection, the 3D points generated from the scan data (or, where available, from an existing model) are converted into a surface representation. An algorithm is run to identify standard geometries and sharp edges. These are used to segment the model into a series of sub-models. A machine learning algorithm can optionally classify the surface to select optimal meshing parameters (such as the algorithm type and the smoothing parameters). The edges have lines or curves fitted to them to improve the quality of the mesh by preserving sharp edges.
The user can select points on the meshed object. The points are selected by generating a line from the user selection tool on top of a virtual surface. Where the line intersects the surface, a point is generated and displayed.
A line across the surface of the object will be created connecting two successive points, using either the shortest path across the surface or a plane with the third degree of freedom selected by the user. The user can select as many points as desired, with a minimum of three points. The user can see the model in 3D and manipulate the model during the selection process.
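The line-surface intersection used in the selection step can be implemented with a standard ray/triangle test. Below is a minimal sketch using the Möller-Trumbore algorithm; in practice every mesh triangle (or a spatial index over them) would be tested and the nearest hit kept. The function name and the use of NumPy are assumptions, not from the patent.

```python
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the intersection point of a ray with one triangle, or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                     # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det
    return origin + t * direction if t > eps else None

# Selecting a point on a unit triangle in the z=0 plane:
tri = [np.array(p, float) for p in ((0, 0, 0), (1, 0, 0), (0, 1, 0))]
print(ray_triangle_intersect(np.array([0.2, 0.2, 1.0]),
                             np.array([0.0, 0.0, -1.0]), *tri))  # -> [0.2 0.2 0.]
```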
C. Path generation algorithm

The selected surface may be segmented into smaller sub-surfaces by extracting surfaces separated by edges (using an edge detection algorithm) or by splitting a large surface into two sub-clusters (and repeating this until the clusters are below a specified size, such as a maximum distance between any two points or similar).
A reference edge is either determined by a machine learning algorithm trained on simulated data to minimise total object path length and complexity, or provided by the user.
User input is created by providing three degrees of freedom, such as two points on the surface and an angle. The reference path will have a corresponding plane. At regular distances along the path (user or machine learning algorithm selected) points will be generated, with planes generated at each point where the plane is perpendicular to the reference edge plane.
A separation of paths (e.g. for painting) will be generated by an algorithm based on the spray cone size, or may be user selected. A distance along each of the new planes across the surface they intersect will be traversed until it is at the path separation. All of these new points will be linked together to generate a new path. The mid points of the new path straight sections will be used to generate new planes (one for each straight section of path), with the normal being the vector connecting the two points making up the straight section of path and the midpoint being on the plane. New paths are generated by traversing the path separation distance across the intersection of the new planes and the surface. The process is repeated until the surface is covered in paths. An optimisation algorithm may be used to reduce path complexity and time (e.g. reference path plane parameters and clustering parameters for separating the area into smaller separate surfaces).
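The spray-cone-derived path separation can be illustrated on the simplest case, a flat rectangular patch, where the plane intersections above reduce to parallel raster lines. This is a hedged sketch of the separation logic only; the overlap fraction and all parameters are illustrative assumptions.

```python
import numpy as np

def spray_path(width, height, standoff, cone_half_angle_deg, overlap=0.3):
    """Boustrophedon waypoints over a width x height patch (all units metres)."""
    spot_radius = standoff * np.tan(np.radians(cone_half_angle_deg))
    separation = 2.0 * spot_radius * (1.0 - overlap)   # adjacent stripes overlap
    ys = np.arange(0.0, height + 1e-9, separation)
    waypoints = []
    for i, y in enumerate(ys):
        xs = (0.0, width) if i % 2 == 0 else (width, 0.0)  # alternate direction
        waypoints += [(xs[0], y), (xs[1], y)]
    return waypoints

print(spray_path(width=2.0, height=1.0, standoff=0.3, cone_half_angle_deg=15))
```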
D. Process simulation

Once the paths across the surface have been generated, a simulation is run to ensure that there are no collisions (some path segments may need to be modified). This can be done by allowing a number of parameters to be changed within a range, including distance from the surface and angle of the tool head relative to the surface. An optimisation algorithm can be used to identify new parameters that do not result in a collision, where the loss parameter is based on the number of points on the path that result in collisions (the spacing between waypoints can be modified to ensure convergence). Where paths cannot be found, these are excluded from further processing and are marked for the user.
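That collision-count loss can be sketched as a simple parameter search. This is a hedged illustration under stated assumptions: `collides` is a hypothetical stand-in for a real swept-volume collision checker, and only the standoff distance is varied here (the tool angle could be searched the same way).

```python
import numpy as np

def collides(point, obstacles, clearance=0.05):
    """Hypothetical point-vs-obstacle check; real systems test swept volumes."""
    return any(np.linalg.norm(point - ob) < clearance for ob in obstacles)

def optimise_standoff(waypoints, normals, obstacles, candidate_standoffs):
    """Pick the standoff whose offset path has the fewest colliding waypoints."""
    best_d, best_loss = None, float("inf")
    for d in candidate_standoffs:
        loss = sum(collides(w + d * n, obstacles)      # loss = colliding points
                   for w, n in zip(waypoints, normals))
        if loss < best_loss:
            best_d, best_loss = d, loss
    return best_d, best_loss                           # loss > 0: mark for user
```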
Optionally, further process simulation may be performed. For painting this may include a deposition model based on the spray tip selected (either by the user or by the path optimisation algorithm in C.), the distance from the object and the system pressure. For open loop grit blasting this is based on the spray cone, distance from the surface, type of blasting media and system pressure.
The velocity along the path is optimised to ensure that the required media is deposited on the surface. This may be modified while executing the path based on sensor feedback.
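A first-pass traverse velocity can be taken from a simple deposition balance, spreading the volumetric flow over the stripe width and target thickness. This is a hedged sketch; the formula v = Q / (w · t) and all numbers are illustrative assumptions, refined in practice by the deposition model and sensor feedback.

```python
def traverse_velocity_m_s(flow_m3_s: float, stripe_width_m: float,
                          target_thickness_m: float) -> float:
    # v = Q / (w * t): flow spread over stripe width and target wet film.
    return flow_m3_s / (stripe_width_m * target_thickness_m)

# e.g. 20 ml/s of paint, 0.2 m stripe, 100 micron wet film target:
print(traverse_velocity_m_s(2e-5, 0.2, 100e-6))   # -> 1.0 m/s, then refined
```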
E. Presentation to user for approval

The user is presented with the path and associated statistics from the modelling.
F. Translation to robot

When the robot is on site, the position at which the robot arm needs to be located to complete the path will be calculated by identifying the transform from the robot frame to the work piece frame. This will be generated using the simultaneous localisation and mapping system. Recommended robot base positions will be provided to the user to ensure all areas can be reached after the robot has worked from all of the recommended base positions.
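Once that transform is identified, applying it to the planned waypoints is a standard homogeneous-coordinates operation. A minimal sketch, assuming 4x4 transforms and Nx3 point arrays (function name is illustrative):

```python
import numpy as np

def to_robot_frame(T_robot_from_workpiece: np.ndarray, points_workpiece):
    """Map planned Nx3 waypoints from the work piece frame to the robot frame."""
    pts = np.asarray(points_workpiece, dtype=float)
    homogeneous = np.hstack([pts, np.ones((len(pts), 1))])
    return (homogeneous @ T_robot_from_workpiece.T)[:, :3]

# Example: workpiece origin sits 2 m ahead of the robot along x.
T = np.eye(4); T[0, 3] = 2.0
print(to_robot_frame(T, [[0.0, 0.5, 1.0]]))   # -> [[2.0, 0.5, 1.0]]
```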
G. Path adjustment/re-planning

The path may be adjusted or completely changed during the process. This will be based on sensor feedback (e.g. visual data run through a machine learning algorithm to identify blast/coating quality). For surface preparation, poorly performed work can be remedied by looping back over the path (automatically initiated where poor blast/coating quality is detected). The velocity may be changed (e.g. for blasting where there is more rust than anticipated).
H. Appending information

Process information may be appended to the generated path; this could include images and other sensor readings.
In other words, there is a first stage in which the work area is visually inspected by the user and a work order is generated and approved. A second stage is the setup of the robotic manipulator assembly in the work location and 3D scanning of the surrounding area to assess for obstacles. A third stage is the user selecting the surfaces of the work object to be treated. The robot autonomously prepares a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface. The path is displayed to the user and approval requested. A fourth stage is a technician attaching an appropriate end effector head, selected for the particular surface preparation application and the geometry of the surface to be treated. The robot autonomously positions itself relative to the surface to be treated and begins surface preparation.
During the surface preparation operation the sensor system, including load cell, laser and range finder, monitors and instructs the adjustment of the position and rotation of the robotic manipulator assembly, manipulator arm and/or the end effector head. Cameras and lasers in the sensor system inspect the quality of the surface preparation during or after the work is performed. Once the operation is complete, the user is notified and a report is generated.
An inertial measurement unit 70 on the robot base enables measurements to be used to dynamically adjust the robot trajectory for instability or movement of the base. This becomes an important feature if the robot is mounted on a long reaching structure such as a hydraulic boom.
The position tracking system 62 can use the camera 64 or lidar based system 68, optionally combined with the inertial measurement unit 70, to accurately track and position the robotic manipulator assembly 11. The position tracking may be the same as described for the handheld controller. Optionally, more than one position tracking system may be used: one for rough positioning (say within a 10 cm sphere; fast and very robust but low accuracy), and the next, a more accurate position tracking system specific to the workpiece, using a surface fitting of the original scan or model used for path planning to the data being observed from a camera system or lidar generating a point cloud. An initial guess for the position may come from the first positional tracking system. The final position tracking system may use laser data to provide fine alignment to the workpiece to maximise quality and consistency.
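The core of the fine-alignment stage is a rigid fit of observed surface points to the original scan. A minimal sketch of the Kabsch/SVD solution follows, assuming the rough positioning stage has already supplied point correspondences; real systems typically iterate this step inside an ICP loop with nearest-neighbour matching.

```python
import numpy as np

def rigid_align(source: np.ndarray, target: np.ndarray):
    """Find R, t minimising ||R @ source_i + t - target_i|| over Nx3 pairs."""
    cs, ct = source.mean(axis=0), target.mean(axis=0)
    H = (source - cs).T @ (target - ct)          # cross-covariance of the sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cs
    return R, t
```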
Information on the environment can be communicated between the robotic manipulator assembly 11 and the handheld controller 60 such that the position calculated by the handheld controller 60 can be used by the robotic manipulator assembly 11.
Data gathered by the robotic manipulator assembly 11 and the handheld controller 60 can be attached to the position of physical objects in the working environment such that useful information can be displayed and accessed by the user.
The information can optionally be overlaid in augmented reality for the user, and the user can then use the handheld controller 60 to change the position of the selected point, both in distance away and in its position in the user's view.
In use, the robotic manipulator assembly 11 is able to be precisely moved and positioned using localisation visual data provided by the camera 52 and/or lidar 54. The visual input data from the camera is processed by a localisation algorithm which ensures the robot avoids collisions with the external environment and personnel.
In order to ensure that the manipulator and head do not collide with the environment, visual images and/or laser range measurements are used to identify points in the environment that have been confirmed to be occupied. These may be added to over time or refreshed frequently. When checking for collisions, the path of the manipulator and head is broken up into a series of steps, usually at constant time steps.
The occupied volume of the head system and the manipulator at each time step in the planned path will be checked for collisions. This may be performed during the process simulation operation described in D. above. The head and manipulator may be represented as a simpler geometry to improve processing time (e.g. a cuboid or a series of spheres of varying diameters). Path planning will use an optimisation approach where a cost function will be applied to a specific path.
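With the sphere representation, the per-time-step check reduces to distance tests between sphere centres and confirmed-occupied points. A hedged sketch (names and data layout are assumptions):

```python
import numpy as np

def path_collides(sphere_centres_per_step, sphere_radii, occupied_points):
    """sphere_centres_per_step: one (K,3) array per constant time step;
    sphere_radii: K radii approximating the head and manipulator volume."""
    occ = np.asarray(occupied_points, dtype=float)
    for centres in sphere_centres_per_step:
        for c, r in zip(centres, sphere_radii):
            if np.any(np.linalg.norm(occ - c, axis=1) < r):
                return True        # this time step's volume is in collision
    return False
```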
Once in position, the individual work item is identified by a user with the handheld controller 60. Using input data from a range of cameras 52 and the laser-based system 54 on the sensor system, the most efficient path to be taken in the surface preparation or coating operation is calculated by a path planning algorithm.
It will be appreciated that the scanning operations and/or planning operations may be performed some time before the work is to be carried out. The scanning operations and/or planning operations may be performed hours, days, weeks, months or even years before the work is to be carried out. The scanning operations and/or planning operations may be performed in the absence or presence of the robot assembly in the work location or environment. The scanning operation and/or planning operation may be performed using available 3D modelling or by generating 3D models using the handheld controller.
Alternatively, the scanning operations and planning operations may be performed in the presence of the robot just before the work is due to be carried out.
The robotic manipulator assembly 11 is configured to conduct an inspection, surface preparation or coating operation on a three-dimensional surface of an object along a calculated path.
The proposed path is generated virtually and sent to the user for review and approval.
Through this method the robot is capable of executing complex paths on a range of geometries including flat areas, pipes and curved surfaces.
Referring to Figure 1A, when a different inspection, surface preparation or coating operation is required, the desired operation may be selected on the handheld controller 60.
An appropriate end effector head 22 for that specific application is mounted on the end effector head mount 20 of the manipulator arm 16. The end effector head 22 may be mounted manually or as part of the plan for a fabric maintenance operation.
The base may optionally include outriggers or extendable supports to provide support and stability to the device. Alternatively or additionally, an electromagnet 80 may be connected to the endless tracks to optionally anchor and fix the position of the endless tracks on metal structures during an inspection, surface preparation or coating operation.
Although in the above example the base is mounted on endless tracks, additionally or alternatively the height of the base may be vertically adjustable such that it can be raised and lowered. The base may be connected to a series of sections by a geared system such that, when a change of height is required, the gear systems climb up or down the sections.
Alternatively, the apparatus may be mounted on a rail system, which is further described in Figures 8A to 10B. Optionally, a work basket may be installed in the assembly to enable a user to inspect and support the work of the robotic manipulator assembly.
Figure 2 shows the different elements of a sensor system 50 of a robotic manipulator assembly according to one embodiment of the invention.
The sensor system may include a 2D lidar/2D laser profiler, 3D lidar, a load cell with one to six degrees of freedom, an IR projector for improved imaging (e.g. a vertical-cavity surface-emitting laser), an active stereo/structured light camera system, laser range finders, a blast attachment, a spectrometer, a camera (mono or stereo), inertial measurement units, ultrasonic wall thickness measurement, a paint pinhole/holiday detector (applied DC brushes), a wet film thickness sensor (e.g. thermal transient analysis) and ultrasonic coating thickness measurement.
Although in this example the sensor system is described as being located or mounted on the end effector head, it will be appreciated that the sensor system may be located or mounted on the robotic manipulator assembly 11, manipulator arm 16 and/or the end effector head mount 20. It will also be appreciated that the robotic manipulator assembly 11, manipulator arm 16, end effector head mount and/or the end effector head may contain different elements of the sensor system.
Figures 3A and 3B show an enlarged view of a manipulator arm 116 of the robotic manipulator assembly 111. An end effector head 122 is reversibly mounted on the end effector head mount 120 connected to the manipulator arm 116.
A sensor system 150 is located on the end effector head 122. In this example the sensor system has a camera 152 and a laser system 154 to ensure correct orientation of the robotic manipulator assembly 111, manipulator arm 116 and end effector head 122 relative to the work location and/or work object.
The camera 152 generates a first data set and the laser system 154 generates a second data set. The first and second data sets are processed to locate the manipulator arm 116 and end effector head 122 to a high resolution or accuracy.
In this example the sensor system 150 includes a load cell 160 on the end effector head 122 to measure and confirm that the end effector head 122 is being held with the required pressure against the surface where surface contact is required. Pressure data is sent from the load cells to an onboard computer via electric or fibre optic cables.
By providing a load cell, real time measurement of the load on the head can be monitored, which allows sufficient pressure to be placed on the equipment to ensure the surface is treated, but not excess pressure which would damage the equipment or prevent it from moving across the surface.
It will be appreciated that other sensor types may be included in the sensor system, including cameras, lasers, a range sensor, spectrometers, wet film thickness sensors, load cells, inertial measurement units, ultrasound sensors, infra-red sensors, infra-red projectors, proximity sensors, inductive sensors, or lidar sensors.
Figure 4A is a flow diagram 200 showing an example surface preparation blasting operation according to an embodiment of the invention. The system has a compressor connected to blast equipment to enable grit and water to be dispensed under pressure via the umbilical/hose system which is connected to the end effector head.
An onboard computer controls the robotic manipulator assembly platform, which controls the movement of the manipulator arm (robot manipulator), end effector system (including the end effector head) and electromagnet (magnetic/mechanical anchor) to optionally anchor and fix the position of the robotic manipulator assembly during the blasting operation.
A sensor system controls the movement of the end effector system. The sensor system comprises a camera system and a laser-based system to enable the robotic manipulator assembly and the connected end effector head to be precisely positioned relative to the surface of the structure or object being worked on. Optionally, a spectrometer may be provided to assess and inspect the surface of the object being treated.
Optionally, a load cell is connected to the end effector head to measure and confirm that the end effector head is being held with the correct pressure against the surface where surface contact is required. In the event of an over pressurisation, an ATEX over pressure system is activated.
The sensor system may assess or inspect the quality of the surface preparation during or after the work is performed. The assessment or inspection of the quality of surface preparation may be performed in real time as the surface is treated. The system may repeat or amend its plan for a fabric maintenance operation based on the results of the assessment or inspection. For example, the blasting path may have been estimated using an initial assessment of the level of rust (e.g. by an algorithm using visual data). If the rust is worse than anticipated, the path speed will be modified and sections may need to be repeated.
Figure 4B is a flow diagram 220 showing an example paint operation on a prepared surface according to an embodiment of the invention. The flow diagram shown in Figure 4B is similar to the flow diagram shown in Figure 4A described above. However, the flow diagram shown in Figure 4B relates to a paint operation carried out by the robotic manipulator assembly. The compressor is therefore connected to a paint system to enable paint to be dispensed under pressure via the umbilical/hose system which is connected to the end effector head.
It will be appreciated that a compressor may not be required for some paint/blasting operations. For example, some tools, such as airless paint tools and electric bristle blasting tools, do not require pneumatic or hydraulic systems.
A sensor system controls the movement of the end effector system during the paint operation or blast operation. The sensor system provides data that is processed and allows either adjustment of the existing path or the generation of a new path for the end effector system during the paint or blast operation. The sensor system comprises a camera system and a laser-based system to enable the robotic manipulator assembly and the connected end effector head to be precisely positioned relative to the surface of the structure or object being worked on. A spectrometer is optionally provided to assess and inspect the surface of the object being treated and, optionally, parameters of the paint layer applied.
The sensor system may assess or inspect the quality of the paint coating during or after the work is performed. The assessment or inspection of the quality of paint may be performed in real time as the surface is painted. The system may repeat or amend its plan for a fabric maintenance operation based on the results of the assessment or inspection.
Inspection may include the processing of visual data, roughness from a probe or laser, or contamination level based on visual, ultrasonic or spectrometer data.
Figure 5 shows a fabric maintenance system 300 set up for imaging a surface to be treated according to an embodiment of the invention. The system 300 is similar to system 10 described in Figure 1 and will be understood from the description of Figure 1. The system 300 comprises a robotic manipulator assembly 311 having a base 312 mounted on endless tracks 314. Each of the endless tracks is mounted on rollers 315. An articulated manipulator arm 316 is pivotally mounted at a first end 316a on the base 312.
The manipulator arm 316 has joints 318 which may be individually actuated to provide the arm 316 with at least six degrees of freedom. The manipulator arm 316 carries an end effector head mount 320 which is movably secured at a second end 316b of the manipulator arm.
A variety of end effector heads may be reversibly fixed to the end effector head mount 320 depending on the desired inspection, surface preparation or coating operation. The end effector head may comprise a plurality of sensors, including cameras, lasers, inductive sensors and/or ultrasonic sensors, to ensure correct mapping of the surface to be treated. Only one end effector head is shown in Figure 5.
Using the sensor system, the robotic manipulator assembly 311 autonomously prepares a path for movement of the robotic manipulator assembly, manipulator arm and/or the end effector head to avoid potential obstacles in the surrounding environment and to undertake an effective treatment of the surface. The system may create a Cartesian path plan to control the overall movement of the robotic manipulator assembly 311, manipulator arm 316 and/or the end effector head 322 during the treatment operation, and a "pick and place" plan of how the blasting sequence 329 should be conducted.
The orientation of the manipulator arm and application head are maintained at all times.
Where three or more proximity sensors are used, a rotation and translation from the current position to the optimal position can be calculated and implemented on all of the servos with a control loop to enable a timely adjustment.
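The geometry behind that correction can be sketched directly: three measured surface points define the local surface plane, and the angle between the plane normal and the head axis gives the corrective rotation. A hedged illustration follows; the head-frame convention and function name are assumptions.

```python
import numpy as np

def correction_from_proximity(p0, p1, p2, head_axis=np.array([0.0, 0.0, 1.0])):
    """p0..p2: surface points (3-vectors) measured by the proximity sensors,
    expressed in the head frame. Returns rotation axis, angle and standoff."""
    normal = np.cross(p1 - p0, p2 - p0)
    normal /= np.linalg.norm(normal)
    if np.dot(normal, head_axis) < 0.0:        # orient the normal towards the head
        normal = -normal
    axis = np.cross(head_axis, normal)         # rotation axis for the servos
    angle = np.arccos(np.clip(np.dot(head_axis, normal), -1.0, 1.0))
    standoff = abs(float(np.dot(p0, normal)))  # current distance to the plane
    return axis, angle, standoff
```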
To identify the precise location of the robotic manipulator assembly, manipulator arm and/or the end effector head, a laser sensor and/or camera is used. Alternatively or additionally, structured light or two cameras in stereo can be used.
A projection of small laser dots using a VCSEL or similar can provide texture to the stereo camera such that an accurate surface geometry can be assessed.
An important part of this system is the continuous visual, spectrometry and/or laser surface profiling by the sensor system. The visual inspection involves capturing an image and then processing it such that each pixel is compared against a reference grading system. This enables the surface to be graded such that it is confirmed to meet the surface preparation specification or paint specification.
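The per-pixel comparison can be sketched as a simple banded classification. This is a hedged illustration only: the mapping from grades to intensity bands, the grade labels and the 95% acceptance threshold are all invented for the example (real reference grading would be calibrated against the applicable standard).

```python
import numpy as np

# Illustrative intensity bands per grade (assumed, not from the patent).
REFERENCE_GRADES = {"pass": (180, 256), "marginal": (120, 180), "fail": (0, 120)}

def grade_surface(gray_image: np.ndarray, required="pass", acceptance=0.95):
    """Return (meets_spec, fraction) for an 8-bit grayscale inspection image."""
    lo, hi = REFERENCE_GRADES[required]
    ok = (gray_image >= lo) & (gray_image < hi)   # per-pixel comparison
    fraction = float(ok.mean())
    return fraction >= acceptance, fraction
```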
The system may comprise a strong external light (most likely one or more LEDs of more than 5000 lm, or VCSEL laser illumination in the IR spectrum) to provide consistent lighting for the assessment. The spectrometer will identify salt and other surface contamination and, based on the algorithm, will identify whether more blasting/washing is required.
A 2D laser profiler (most likely using a light-plane-intersecting method that triangulates the reflected light) will provide the distance to points on a 2D line projected onto the targeted surface such that the surface roughness can be estimated.
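From one such profile line, a standard arithmetic-mean roughness (Ra) estimate can be computed after removing the mean line. A minimal sketch (the first-order tilt removal is a common refinement and an assumption here, as is the function name):

```python
import numpy as np

def roughness_ra(profile_mm: np.ndarray) -> float:
    """Ra from a 1D array of profiler distances (mm) along the projected line."""
    x = np.arange(len(profile_mm))
    slope, intercept = np.polyfit(x, profile_mm, 1)   # remove head tilt/standoff
    deviations = profile_mm - (slope * x + intercept)
    return float(np.mean(np.abs(deviations)))         # arithmetic mean deviation
```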
The algorithm uses the sensor data to identify whether re-blasting is required. If it is, the area may be re-blasted and/or washed and then retested until satisfactory results are obtained, in order to determine whether there is any remaining corrosion, rust or contaminants.
The end effector can optionally be used as a handheld unit for assessment/measurement in areas that are difficult for the robot to access. The handheld unit or device may be used by a user for logging data.
The sensor system may comprise one or more cameras which collect visual data on the rust level of steel surfaces, where this data is processed by an algorithm to determine the grade of rust against industry standards. The sensor system may comprise a 2D laser profiler to assess the level of surface preparation of steel surfaces, where data collected by the laser profiler is processed by an algorithm which is used to identify any areas of the surface which have not been blasted to a pre-set standard. The standard is defined by the user.
The planned path, once created, may be displayed to the user for approval. The displayed path may consist of a virtual representation of the environment to be worked in, including the area to be treated for surface preparation and/or painted. It may also display how the robotic manipulator assembly, manipulator arm and/or the end effector head will move, as well as the base of the robotic manipulator assembly (if it is mobile). Key statistics may be calculated from the simulation, including surface area blasted and/or painted per unit time.
The number of robotic manipulator assembly/base station moves and the expected volume of paint, grit or dry ice, along with areas that the manipulator will not be able to cover due to obstacles etc., will be flagged to the user.
Figure 6 shows a selection of different end effector heads that may be removably attached to the end effector head mount 120 connected to the manipulator arm 116. The end effector head 522 type may be selected depending on the type of operation required and the shape or profile of the surface to be treated.
As an example, end effector heads 540a, 540b and 540c are configured for open loop grit blasting of three different surface shapes. The end effector heads 540a, 540b and 540c have a single conduit 541 to supply grit to the surface to be treated.
End effector heads 542a, 542b and 542c are configured for closed loop grit blasting of three different surface shapes. The end effector heads 542a, 542b and 542c have a supply conduit 543 to supply grit to the surface to be treated and a return line 545 for waste grit and rust etc.

End effector heads 544a, 544b and 544c are configured for a different type of surface preparation, such as open loop water jetting of three different surface shapes.
End effector heads 546a, 546b and 546c are configured for painting structures or surfaces having three different surface shapes.
The end effector heads may be selected to match the required surface treatment (surface preparation or coating), whether it is an open or closed loop operation, and the shape or profile of the surface to be treated. As an example, if a small diameter pipe is to be closed loop grit blasted, then end effector head 542a would be selected. If, however, a flat surface was to be treated with an open loop grit blast operation, then end effector head 540c would be selected.
Selecting an appropriate or corresponding end effector head shape for the surface to be treated enables close contact between the end effector head and the surface. This may allow improved treatment and may allow the sensors in the sensor system to take more accurate measurements. The end effector heads may have different surface preparation or painting functions. Each end effector head may have a different shape, profile or structure-engaging surface.
Figure 7 shows a robotic manipulator assembly 611 with an end effector head 622 detached from the manipulator arm 616. The end effector head 622 is reversibly mounted on the end effector head mount 620 connected to the manipulator arm 616.
A head attachment system 623 is located at the end of the end effector head mount 620. The head attachment system comprises a rough alignment guidance system 625 which can correct for angular or translational offset when connecting a new or different end effector head. The head attachment system allows quick connection and disconnection of end effector heads from the robotic manipulator assembly 611.
In this example the rough alignment guidance system 625 comprises three or more tapered rods 627 which allow for rough, then successively finer, alignment of the end effector head mount 620 with the end effector head 622. A load cell 640 in the end effector head mount 620 can measure the force exerted by the tapered alignment system 625, which can be used with a control loop (e.g. a PID loop) to correct for make-up path errors.
A locking mechanism 630 allows the end effector head 622 to be locked in position. This could consist of a ball bearing locking system where a cylinder is extended when air pressure is applied such that ball bearings are forced down a tapered surface, extending out in the process. They can then lock into a groove in the female part of the mating mechanism. This mechanism could also be electrically actuated. Additionally or alternatively, a collet type connector, actuatable pins, balls or a simple J-slot may be used.
Pneumatic feedthrough conduits may be included which consist of a polymer seat and a tapered cone profile. Similarly, electrical connections can be made up through a male pin and female receptacle. A seal around the male pin can be included to ensure debris is excluded. Alternatively, cylindrical sprung electrical pins can be used which form a press fit when the male and female parts of the connector are pulled together.
Paint, pneumatic and vacuum conduits 632 are sealed with a polymer ring and a male part consisting of either a tapered polymer or metal part. The polymer ring is soft enough to allow for a small amount of float to accommodate misalignment.
To ensure electrical safety in explosive environments, relays or similar switches can be used to isolate electrical pins during the period they are exposed to the environment.
Electrical pins may be flushed with an insulating fluid or gas to ensure that they do not cause a spark potential in explosive environments.
The mechanical connection for the end effector head will include connectors to one or more conduits that are mounted on the robotic manipulator assembly. These may include a high pressure conduit (expected to be up to 12 bar, though possibly higher pressure) for carrying one or more of grit, dry ice and sponge blasting media.
In a closed loop system a second conduit, such as a vacuum line, may be included which, when grit or sponge media is used, enables said media to be recovered, minimising any clean up required.
A paint line will enable paint to be applied and may be accompanied by a flushing line enabling the lines to be cleaned automatically. A water line may be included for washing surfaces that do not meet the cleanliness requirements.
Where closed loop blasting is being used, a seal against the surface being blasted needs to be created to avoid grit escaping. This will be formed by a replaceable attachment that has a number of fibres extending from the head (bristles, like on a brush). These fibres can be bent such that a seal is preserved even if the head is moved. A number of attachments will enable different surfaces to be blasted without media escaping; however, the correct attachment needs to be identified for the area to be blasted, and an area may require multiple heads to be used.
A sensor (camera or laser point cloud) identifies the local radius of curvature, which is then matched to the required head attachment. Areas are segmented based on radius of curvature, and the path is optimised to minimise time by grouping similar areas together.
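The curvature-to-attachment matching can be sketched with an algebraic circle fit over a 2D slice of the point cloud. This is a hedged illustration: the Kasa least-squares fit is one standard choice (not named in the patent), and the attachment radii and flatness threshold are invented for the example.

```python
import numpy as np

ATTACHMENT_RADII_MM = {"small_pipe": 50.0, "large_pipe": 200.0}  # illustrative

def fit_radius(points_2d: np.ndarray) -> float:
    """Kasa circle fit: solve 2*cx*x + 2*cy*y + c = x^2 + y^2 by least squares."""
    x, y = points_2d[:, 0], points_2d[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x**2 + y**2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return float(np.sqrt(c + cx**2 + cy**2))

def select_attachment(points_2d: np.ndarray, flat_threshold_mm=1000.0) -> str:
    r = fit_radius(points_2d)
    if r > flat_threshold_mm:                 # effectively flat surface
        return "flat"
    return min(ATTACHMENT_RADII_MM,           # closest-radius attachment
               key=lambda k: abs(ATTACHMENT_RADII_MM[k] - r))
```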
Figures 8A and 8B show perspective and side views of a fabric maintenance system 700 according to an embodiment of the invention. The fabric maintenance system 700 and robotic manipulator assembly 711 are similar to the system 10 and robotic manipulator assembly 11 described in Figure 1 and its operation will be understood from the description of Figures 1 to 7 above. However, the robot manipulator assembly 711 is mounted on a rail rather than a base with endless tracks.
In this example the rail system 720 comprises a rail 722 which is secured to a wall 724, with the fixings to the wall using known fixing means such as bolts into stone, brick or concrete, or continuous or bolted fixings on to attachment points (e.g. tapped holes in a steel structure), or welds onto a metal structure (e.g. the rail attachment points welded on to a metal structure such as the side of a standard shipping container). It will be appreciated that the rail 722 could alternatively be mounted to the floor or ceiling.
The rail system 720 comprises wheels 726 connected to a central member 728. The wheels are configured to move along an internal profile of the rail 722, such as a channel or track 730 of the rail 722. The rail 722 has a lip 732 configured to prevent the wheels 726 and central member 728 from exiting the track 730. A guide member 734 is aligned with a recess 736 in the central member 728 to maintain the orientation and alignment of the central member during travel along the rail 722.
The wheels are made of a suitable material to resist bending, tension and compression loads. Multiple wheels at different angles could be used. Alternatively or additionally, multiple rails may be used to provide additional degrees of freedom.
A support platform 740 is connected to the central member 728. The robot manipulator assembly 711 is mounted on or connected to the support platform. A soft outer cover may be included such that dust and other debris is excluded from the assembly.
Figure 9A shows an alternative rail arrangement for a fabric maintenance system 750 which is similar to the system 700 described in Figure 8A and will be understood from the description of Figure 8A above. In this example the wheels 726 are configured to be driven along the channel or track 730 of the rail 722 by a lead screw, shown in Figure 9B. A nut 752 is connected to the support platform 740 and the lead screw 754 is connected to a motor 756. As the motor turns the screw, the nut and connected support platform travel along the screw. It will be appreciated that other methods of driving the wheels along the rail may include a roller screw, powered wheels or a pneumatic or hydraulic cylinder.
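The relationship between motor speed and platform travel follows directly from the screw lead (one lead of linear travel per revolution). A short hedged sketch; the lead and speed values are illustrative assumptions.

```python
def platform_speed_mm_s(motor_rpm: float, screw_lead_mm: float) -> float:
    # One screw lead of nut travel per motor revolution.
    return motor_rpm / 60.0 * screw_lead_mm

print(platform_speed_mm_s(motor_rpm=300, screw_lead_mm=10))  # -> 50.0 mm/s
```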
In the examples described in Figures 8A to 9B, wheels are configured to run along a track or channel within a rail. It will be appreciated that the rail system may be configured for wheels, tracks or bearings to travel over an outer surface of a rail.
As an example, Figures 10A and 10B show an alternative rail arrangement for a fabric maintenance system 800 which is similar to the system 700 described in Figure 8A and will be understood from the description of Figure 8A above. However, in this example, instead of using wheels with a track in the rail, the rail system 820 comprises a rail 822 and a robot assembly support frame 839. The robot assembly support frame comprises a recirculating linear bearing 826 and a support platform 840. The robot manipulator assembly 811 is mounted on or connected to the support platform. The recirculating linear bearing 826 is configured to engage an outer surface of the rail 822 to move the robot assembly support frame along the length of the rail 822. The recirculating linear bearing could be powered or unpowered.
The invention provides a sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location. The sensor system comprises a first optical system for collecting a first data set relating to a work location and/or work object, a second optical system for collecting a second data set relating to a work location and/or work object, and at least one processing module for processing the first and second data sets. The first optical system comprises an optical camera, and the first data set comprises camera imaging data. The second optical system comprises a laser positioning system, and the second data set comprises laser positioning data. The sensor system is operable to process the first data set to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy. The sensor system is operable to process the second data set to locate a robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy. The second resolution or accuracy is higher than the first resolution or accuracy.
The invention provides a sensor system which allows industrial fabric maintenance of a work object in a work location in a safe and effective manner. The invention may be used in a variety of locations such as an industrial site or paint shop. It may also be used in the inspection, surface preparation and coating of new equipment, and of equipment being refurbished and/or maintained.
By providing a sensor system for a robotic apparatus, data relating to the work location, work object and/or the fabrication operation may be collected to enable the robot apparatus to be positioned and moved accurately relative to a work object in a work environment. This may allow a wide range of industrial fabric maintenance operations on a work object to be conducted, including inspection operations, surface preparation operations and/or coating operations. This may also allow fabric maintenance of a work object to be conducted reliably, remotely and/or autonomously to a high standard.
The accurate positioning of a robot apparatus and provision of a sensor system may allow real time measurement and feedback on a robot end effector on the surface to be treated or inspected. This may ensure that sufficient pressure is placed on the equipment to allow effective treatment or measurement whilst avoiding over pressurisation which may prevent the equipment from moving across the surface.
Another advantage of the sensor system is that the quality of the fabric maintenance operation may be assessed using the sensors in real time. This ensures that the surface meets the specification required by the end user before the robot assembly is moved to a different location or piece of work.
Another advantage of the invention is that fabric maintenance operations are often required to be conducted within complex and often congested areas. By providing a sensor system for a robotic apparatus, it may be positioned and moved accurately around obstacles such as infrastructure or other equipment.
The foregoing description of the invention has been presented for the purposes of illustration and description and is not intended to be exhaustive or to limit the invention to the precise form disclosed. The described embodiments were chosen and described in order to best explain the principles of the invention and its practical application, to thereby enable others skilled in the art to best utilise the invention in various embodiments and with various modifications as are suited to the particular use contemplated. Therefore, further modifications or improvements may be incorporated without departing from the scope of the invention herein intended.
Claims (26)
- 1. A sensor system for a robotic apparatus for industrial fabric maintenance of a work object in a work location, the sensor system comprising: a first optical sensor for collecting data relating to a work location and/or work object; a second optical sensor for collecting data relating to a work location and/or work object; at least one processing module for processing the collected data; wherein the sensor system is operable to process the data to locate the robotic apparatus in relation to the work location and/or work object to a resolution or accuracy.
- 2. The sensor system according to claim 1 wherein the sensor system is operable to process the data to locate a robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, wherein the second resolution or accuracy is higher than a first resolution or accuracy.
- 3. The sensor system according to claim 1 or claim 2 wherein the first optical sensor is an optical camera for collecting a first data set comprising camera imaging data.
- 4. The sensor system according to any preceding claim wherein the second optical sensor is an optical camera for collecting a second data set comprising camera imaging data, or wherein the second optical sensor is a laser positioning system and a second data set comprises laser positioning data.
- 5. The sensor system according to any preceding claim wherein the industrial fabric maintenance is inspection operations, surface preparation operations and/or coating operations.
- 6. The sensor system according to any preceding claim wherein the sensor system is configured to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during, before or after a fabric maintenance operation.
- 7. The sensor system according to any preceding claim wherein the sensor system comprises at least one further sensor, wherein the at least one further sensor is selected from the group consisting of a camera, laser, a range sensor, spectrometer, wet film thickness sensor, load cell, inertial measurement unit, ultrasound sensor, infra-red sensor, proximity sensor, inductive sensor, or a lidar sensor.
- 8. The sensor system according to any preceding claim wherein the sensor system, or at least a part thereof, is mounted on the robotic apparatus.
- 9. The sensor system according to any preceding claim wherein the sensor system, or at least a part thereof, is mounted on a functional module of the robotic apparatus.
- 10. The sensor system according to any preceding claim wherein the at least one processing module uses an existing plan or generates a plan to locate, orientate, control and/or position the robotic apparatus, or at least a part thereof, during a fabric maintenance operation.
- 11. The sensor system according to any preceding claim wherein the at least one processing module processes the collected data using at least two algorithms or at least two independent parts of one algorithm, wherein a first algorithm or a first part of an algorithm processes the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
- 12. The sensor system according to claim 11 wherein a second algorithm or a second part of the algorithm processes the data to locate the robotic apparatus in relation to the work location and/or work object to provide a second resolution or accuracy or wherein the second algorithm or a second part of the algorithm processes the data to locate the robotic apparatus in relation to the work location and/or work object to augment the resolution or accuracy of the first algorithm or a first part of the algorithm by combining information from the second algorithm or a second part of the algorithm with that from the first algorithm or first part of the algorithm.
- 13. The sensor system according to claim 11 or claim 12 wherein the algorithms are combined together such that they are sub-algorithms in a larger algorithm.
- 14. The sensor system according to claim 10 wherein data from the first optical system, second optical system and/or at least one sensor is processed to follow a movement sequence which defines the movement path for a functional module of the robotic apparatus.
- 15. The sensor system according to claim 9 or 10 wherein the functional module is selected from the group consisting of an inspection system, a surface treatment system or a coating system.
- 16. The sensor system according to claim 15 wherein the surface treatment system is selected from the group consisting of a water jetting system, a dry ice blasting system and an abrasive blasting system.
- 17. The sensor system according to claim 15 wherein the coating system is a paint system.
- 18. The sensor system according to any preceding claim wherein the sensor system is configured to undertake fabric maintenance tasks remotely and/or autonomously.
- 19. The sensor system according to any preceding claim wherein the sensor system is configured to assess the operation of the functional module and/or the quality of the fabric maintenance.
- 20. The sensor system according to any preceding claim wherein the sensor system is configured to avoid obstacles and collisions with the external environment and/or personnel.
- 21. A method of operating a robotic apparatus for industrial fabric maintenance of a work object in a work location, the method comprising: providing a robotic apparatus with a sensor system, the sensor system comprising: a first optical sensor for collecting data relating to a work location and/or work object; a second optical sensor for collecting data relating to a work location and/or work object; at least one processing module for processing collected data; and processing the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy.
- 22. The method according to claim 21 comprising processing the data to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy; wherein the second resolution or accuracy is higher than the first resolution or accuracy.
- 23. The method according to claim 21 or claim 22 wherein the first optical sensor comprises an optical camera providing camera imaging data as a first data set and wherein the second optical sensor comprises an optical camera providing camera imaging data as a second data set or wherein the second optical sensor comprises a laser positioning system providing laser positioning data as a second data set.
- 24. The method according to any one of claims 21 to 23 comprising locating, orientating, controlling and/or positioning the robotic apparatus, or at least a part thereof, in response to the processed data.
- 25. The method according to any one of claims 21 to 24 comprising processing the collected data using at least two algorithms or at least two independent parts of one algorithm, wherein a first algorithm or a first part of an algorithm processes the data to locate the robotic apparatus in relation to the work location and/or work object to a first resolution or accuracy and wherein a second algorithm or a second part of the algorithm processes the data to locate the robotic apparatus in relation to the work location and/or work object to a second resolution or accuracy, or wherein the second algorithm or a second part of the algorithm augments the resolution or accuracy of the first algorithm or a first part of an algorithm by combining information from the second algorithm or a second part of an algorithm with that from the first algorithm or first part of an algorithm.
- 26. The method according to any one of claims 21 to 25 comprising monitoring the movement of the robotic apparatus during, before and/or after a fabric maintenance operation and/or comparing the location of the robotic apparatus, or at least a part thereof, with a plan.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GBGB1911458.6A GB201911458D0 (en) | 2019-08-09 | 2019-08-09 | Fabric maintenance sensor system |
GBGB1911466.9A GB201911466D0 (en) | 2019-08-09 | 2019-08-09 | Fabric maintenance system |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202012375D0 GB202012375D0 (en) | 2020-09-23 |
GB2589419A true GB2589419A (en) | 2021-06-02 |
Family
ID=72470550
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2012375.8A Withdrawn GB2589419A (en) | 2019-08-09 | 2020-08-10 | Fabric maintenance sensor system |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2589419A (en) |
WO (1) | WO2021028673A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102022100389A1 (en) | 2022-01-10 | 2023-07-13 | Bayerische Motoren Werke Aktiengesellschaft | System and method for recording application data in real time |
DE102022000701A1 (en) | 2022-02-25 | 2023-08-31 | Visevi Robotics GmbH | Autonomous manipulation system for maintenance and inspection work on track systems |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112934543A (en) * | 2021-01-27 | 2021-06-11 | 武船重型工程股份有限公司 | Spraying robot |
SE2151621A1 (en) * | 2021-12-25 | 2023-06-26 | Husqvarna Ab | Improved navigation for a robotic work tool system |
CN114603562B (en) * | 2022-04-19 | 2024-04-30 | 南方电网电力科技股份有限公司 | Distribution network electrified lead connecting device and method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0263952A2 (en) * | 1986-10-15 | 1988-04-20 | Mercedes-Benz Ag | Robot unit with moving manipulators |
GB2264569A (en) * | 1992-02-13 | 1993-09-01 | Honda Motor Co Ltd | Wheel mounting robot |
US20070216332A1 (en) * | 2003-10-20 | 2007-09-20 | Georg Lambert | Method for Effecting the Movement of a Handling Device and Image Processing Device |
US20090057373A1 (en) * | 2007-08-30 | 2009-03-05 | Gm Global Technology Operations, Inc. | Multi-Purpose End Effector for Welder |
WO2016160930A1 (en) * | 2015-03-30 | 2016-10-06 | Google Inc. | Imager for detecting visual light and infrared projected patterns |
WO2017015105A1 (en) * | 2015-07-17 | 2017-01-26 | Apex Brands, Inc. | Vision system with automatic calibration |
US20170043477A1 (en) * | 2015-08-10 | 2017-02-16 | Fanuc Corporation | Robot system with visual sensor and a plurality of robots |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4404183A3 (en) * | 2016-12-23 | 2024-09-25 | Gecko Robotics, Inc. | Inspection robot |
US10814480B2 (en) * | 2017-06-14 | 2020-10-27 | The Boeing Company | Stabilization of tool-carrying end of extended-reach arm of automated apparatus |
- 2020
- 2020-08-10 GB GB2012375.8A patent/GB2589419A/en not_active Withdrawn
- 2020-08-10 WO PCT/GB2020/051904 patent/WO2021028673A1/en active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0263952A2 (en) * | 1986-10-15 | 1988-04-20 | Mercedes-Benz Ag | Robot unit with moving manipulators |
GB2264569A (en) * | 1992-02-13 | 1993-09-01 | Honda Motor Co Ltd | Wheel mounting robot |
US20070216332A1 (en) * | 2003-10-20 | 2007-09-20 | Georg Lambert | Method for Effecting the Movement of a Handling Device and Image Processing Device |
US20090057373A1 (en) * | 2007-08-30 | 2009-03-05 | Gm Global Technology Operations, Inc. | Multi-Purpose End Effector for Welder |
WO2016160930A1 (en) * | 2015-03-30 | 2016-10-06 | Google Inc. | Imager for detecting visual light and infrared projected patterns |
WO2017015105A1 (en) * | 2015-07-17 | 2017-01-26 | Apex Brands, Inc. | Vision system with automatic calibration |
US20170043477A1 (en) * | 2015-08-10 | 2017-02-16 | Fanuc Corporation | Robot system with visual sensor and a plurality of robots |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102022100389A1 (en) | 2022-01-10 | 2023-07-13 | Bayerische Motoren Werke Aktiengesellschaft | System and method for recording application data in real time |
DE102022000701A1 (en) | 2022-02-25 | 2023-08-31 | Visevi Robotics GmbH | Autonomous manipulation system for maintenance and inspection work on track systems |
Also Published As
Publication number | Publication date |
---|---|
WO2021028673A1 (en) | 2021-02-18 |
GB202012375D0 (en) | 2020-09-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2589419A (en) | Fabric maintenance sensor system | |
GB2589418A (en) | Fabric maintenance system and method of use | |
US10718119B2 (en) | Automated drywall sanding system and method | |
US11579097B2 (en) | Localization method and system for mobile remote inspection and/or manipulation tools in confined spaces | |
US10864640B1 (en) | Articulating arm programmable tank cleaning nozzle | |
US10011012B2 (en) | Semi-autonomous multi-use robot system and method of operation | |
US10449619B2 (en) | System for processing a workpiece | |
US10576627B1 (en) | System and method for inspection and maintenance of hazardous spaces with track and roller | |
CA2554992A1 (en) | Cost effective automated preparation and coating methodology for large surfaces | |
US11731281B2 (en) | Automation in a robotic pipe coating system | |
KR20210018107A (en) | Exterior wall maintenance apparatus | |
EP2994248A1 (en) | Multifunction robot for maintenance in confined spaces of metal constructions | |
US20190134820A1 (en) | Tank Cleaner | |
CN118144946A (en) | Cleaning and detecting device and method for multi-wall aviation cockpit with special-shaped angle steel | |
Mateos et al. | Automatic in-pipe robot centering from 3D to 2D controller simplification | |
Paul et al. | A robotic system for steel bridge maintenance: Field testing | |
JP6735316B2 (en) | Surface treatment equipment | |
Salunke et al. | Pipe cleaning robot | |
CN117140543A (en) | Intelligent water-cooled wall climbing maintenance operation robot and working method thereof | |
US12172266B1 (en) | System and method for media blasting a workpiece | |
US11745309B1 (en) | Remotely operated abrasive blasting apparatus, system, and method | |
Mende et al. | Environment modeling and path planning for a semi-autonomous manipulator system for decontamination and release measurement | |
Notheis et al. | Towards an autonomous manipulator system for decontamination and release measurement | |
TWI801805B (en) | Coating system and its application method | |
Kesler | Assessment of a velocity-based robot motion planner for surface preparation with geometric uncertainty |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |