WO2007066166A1 - Method and system for processing and displaying maintenance or control instructions - Google Patents
- Publication number
- WO2007066166A1 (PCT/IB2005/003709)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- equipment
- image
- control
- view
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/409—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/4183—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32014—Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35482—Eyephone, head-mounted 2-D or 3-D display, also voice and other control
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35495—Messages to operator in multimedia, voice and image and text
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P80/00—Climate change mitigation technologies for sector-wide applications
- Y02P80/10—Efficient use of energy, e.g. using compressed air or pressurized fluid as energy carrier
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Definitions
- This invention relates to an improved method and system for providing, processing and displaying maintenance and/or control instructions for an equipment, plant or a process by means of accessing data and stored information and displaying information relevant to the device or process.
- the invention provides improved means to generate and display instructions necessary to configure, maintain or repair equipment.
- a traditional automation system is one in which each physical asset is described in a database. In its turn, this database is
- An installation may also include equipment from different suppliers and from different industries. It is a complex and difficult task to retrieve information from all of those systems so as to give a reliable and extensive picture, even on a historical basis. It is also very difficult, time consuming and error prone to organize, enter, maintain and retrieve information related to a specific device. It is even more difficult to retrieve and/or access such information when an alarm or other event is reported.
- the real world object may be a single device, an object in a process or complete equipment.
- augmented reality image may be overlaid on a real world image.
- world/augmented reality data may be carried out to provide an interaction with a control device or system.
- one or more embodiments of the invention provide an improved method for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising identifying a said equipment, making and processing an image of a first view of the said equipment, making, retrieving or recording an annotation of control or maintenance instructions for said equipment, arranging the annotation represented as a graphic element for a display, combining the first view of the image and the annotation into a composite image, and displaying the composite image of said equipment combined with additional control or maintenance instructions to a user.
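The identify-capture-annotate-composite-display sequence described above can be sketched in outline. All class and function names below are illustrative assumptions, not taken from the patent, and the image data is a simple placeholder:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Annotation:
    text: str
    position: Tuple[int, int]   # pixel offset relative to the first view

@dataclass
class CompositeImage:
    equipment_id: str
    view: str                   # placeholder for raw image data
    annotations: List[Annotation] = field(default_factory=list)

def build_composite(equipment_id: str, view: str,
                    instructions: List[Tuple[str, Tuple[int, int]]]) -> CompositeImage:
    """Combine a first view of the identified equipment with annotation
    graphics representing control or maintenance instructions."""
    composite = CompositeImage(equipment_id, view)
    for text, pos in instructions:
        composite.annotations.append(Annotation(text, pos))
    return composite

composite = build_composite("motor-7", "<frame>",
                            [("Check bearing temperature", (120, 80))])
```

The composite object would then be rendered to the user's display, with each annotation drawn at its offset over the first view.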
- an improved method for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising recording an annotation, wherein the annotation for said equipment is placed in a selected position relative to the first view of the image.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising generating the annotation as a part of a layer for superposition on the first view of the image.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising arranging the annotation to be displayed relative to a position on the first view of the image or relative to a position on the display screen.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising displaying an instance of the composite image.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising recording an annotation which is provided by an input action of a second user or expert, such as by manipulating a second instance of the composite image displayed on the second display.
- an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising adding one or more annotations to the composite image and displaying the image to a plurality of other users.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising in part providing control or maintenance instructions used to configure a set point or switch a device on or off by means of a software means forming part of the composite image.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising in part activating a software means and thereby providing control or maintenance instructions used to generate one or more control signals.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising identifying said equipment by image processing of a unique marker means on or near said equipment.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising identifying said equipment by image processing of images provided by the optical means, or by processing the images in part by scanning natural features of the images.
- an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising changing the position and/or orientation of the annotation in the composite image relative to a distance or size relationship in a second view of the image, dependent on a distance or size relationship determined with an image of a first view.
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising changing both the position and size of the annotation relative to, and depending on, a second view of the image dependent on the size relationship determined with the first view.
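The repositioning and rescaling between a first and second view can be illustrated with a simple ratio computation. The function and its arguments are hypothetical, assuming the equipment's apparent size is measured in pixels in each view:

```python
def rescale_annotation(position, size, apparent_size_view1, apparent_size_view2):
    """Scale an annotation's position offset and size by the ratio of the
    equipment's apparent size between the first and second views."""
    ratio = apparent_size_view2 / apparent_size_view1
    x, y = position
    return (x * ratio, y * ratio), size * ratio

# equipment appears half as large in the second view: annotation halves too
new_pos, new_size = rescale_annotation((100, 40), 12.0, 200.0, 100.0)
```

Here `new_pos` becomes (50.0, 20.0) and `new_size` becomes 6.0, so the annotation shrinks and moves in step with the equipment as the user steps back.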
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising identifying said equipment by processing
- an improved method for processing and displaying control or maintenance instructions for an apparatus, the method comprising generating instructions and information for carrying out a repair, re-configuration, re-programming or replacement of a faulty part of said equipment, plant or process.
- one or more embodiments of the invention provide an improved system for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for monitoring and control, the control system comprising an AR system comprising a tracking means, a display means, at least one computer means, a data communication means for communicating an image of a first view of said equipment, computer software image processing means for identifying the said equipment in the industrial process or facility, computer software means to record control or maintenance instructions comprising any of text, video or voice messages, computer software for attaching said message information, and computer software means to provide a composite image of an image of said equipment combined with additional control or maintenance instructions to a user.
- the invention comprises an improved system for processing and displaying an augmented reality representation of computer-generated graphical information, generated and stored in advance, overlaid on the real world.
- Such an improved system may also comprise a communication system supporting voice
- such an improved augmented reality system may comprise: a handheld interacting and pointing device with a tracking system for determining its position and orientation in relation to a world coordinate system, and a portable display device, preferably a wearable device such as glasses, a head-mounted display or head-up display, or else a PDA, notebook, tablet computer or similar, for visualizing augmented reality overlaid on the view of the real world.
- the display device further comprises a camera, such as a video camera of some sort, for capturing a stream of images of the environment, mounted on or integrated with the display device.
- the camera is mounted in a fixed position on the display device, and the display is preferably located along the camera view axis and at the camera's image plane.
- the system combines computer-generated graphics and annotations with the live video stream and projects the combined augmented reality video onto the display device.
- the user sees the video with overlaid virtual information as if he were looking at the real world.
- the computer-generated graphics are registered directly onto the display device and follow both the image viewed and the user's view of the real world.
- the virtual graphics are overlaid on the optical image of the real world without including a video image of the real world.
- One or a series of still images may be used to carry out the same or similar steps or actions as with the stream of video images described above.
- unique identity markers may be placed on objects (equipment, devices and systems) in the environment. These are preferably processed by a recognition system that recognizes the unique IDs and uses those IDs to identify a known device or equipment controlled by a control system. Further, a tracking system is included for determining the position and orientation of a tracking device in relation to a world coordinate system. As well, or as an alternative, a recognition system for recognizing and tracking one or more visible objects at a physical location may be used.
- a vision-based tracking system using a recognition or tracking technique known as Natural Feature tracking may be used.
- For more detail about one type of natural feature tracking see, for example, the article "Natural Feature Tracking for Extendible Robust Augmented Realities" by Park, You & Neumann (USC), in a paper for the First International Workshop on Augmented Reality, June 1998.
- This type of technique may be used with image processing to recognise one or more objects in the real world and supply information about viewer-to-object distance, and changes in that distance, so that a position of the user in relation to certain real world objects may be determined. The determinations or calculations may then be used to provide a graphic display depending in part on how the position and orientation of the user change relative to a given real object.
- the processing of the image may be sped up by combining a calculation of a movement path of the camera/viewer based on changes in the viewer-to-object distance for one or more objects. Calculations are advantageously carried out by
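One way to obtain the viewer-to-object distance mentioned above is the standard pinhole-camera approximation, assuming a known real-world width of the tracked object and a calibrated focal length. These parameter names are assumptions for illustration, not taken from the patent:

```python
def estimate_distance(real_width_m, focal_length_px, apparent_width_px):
    """Pinhole-camera estimate: distance = focal_length * real_width / apparent_width.

    real_width_m      : known physical width of the tracked object (metres)
    focal_length_px   : camera focal length expressed in pixels
    apparent_width_px : width of the object as seen in the current frame
    """
    return focal_length_px * real_width_m / apparent_width_px

d_near = estimate_distance(0.5, 800.0, 200.0)  # 2.0 m
d_far = estimate_distance(0.5, 800.0, 100.0)   # 4.0 m: the apparent width halved,
                                               # so the user has moved away
```

The change from `d_near` to `d_far` between frames is exactly the kind of viewer-to-object distance change the text describes using to update the graphic display.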
- a major advantage of the present invention is that a maintenance action to respond to a new alarm or other event may be handled in a more timely way.
- Collaboration between and contact with internal users, technicians, operators and/or engineers with known and recorded technical information and/or expertise may be carried out quickly and effectively.
- the engineer, operator or other user has a portable display device or portable computing and display device. This may comprise a device such as a PDA, or a head-up display arranged as glasses or on an adapted helmet.
- a camera or other position tracking means is mounted on or near the glasses, capturing quite closely the user's view of the equipment, including the user's position and orientation.
- Communication is preferably carried out using a head-set, mobile phone, or phone with a hands-free type attachment, so that the user may communicate with other users or experts and at the same time have one or more hands free for manipulating the display and/or making an adjustment to a control panel or component of the equipment, plant or process.
- Annotations may be made in real time to information, text or graphics displayed to each user and/or expert collaborating over the issue. Annotations or attachments in the form of text annotations, digital files, video clips or sound clips may be attached to a shared electronic document or notice board, shared display screen means, virtual whiteboard etc.
- the information retrieved and/or developed during collaboration provides instruction to the user for monitoring, troubleshooting, configuring or repairing the equipment.
- the method and system are facilitated by one or more computer programs in the control system and a computer architecture in which each user may be configured in a database, log-in file, table or similar to record one or more types of technical expertise he/she is known to possess and one or more types of communication means to reach the expert.
- When a software entity representing a component part of the equipment, plant or process in the control system is selected, a second software entity may be activated to access and retrieve information, especially contact information, phone numbers, IP phone numbers,
- FIGURE 1 shows a schematic diagram of an overview of a system to generate control and/or maintenance instructions for an industrial device or process according to an embodiment of the invention;
- FIGURE 2 is a schematic diagram overview of a system to generate control and/or maintenance instructions for an industrial device or process according to another embodiment of the invention.
- FIGURE 3 is a schematic diagram of a screen display showing a selection on a composite image provided to generate control and/or maintenance instructions for an industrial device or process according to an embodiment of the invention;
- FIGURE 4 is a schematic diagram of a screen display showing a selection on a composite image provided to generate control and/or maintenance instructions for an industrial device or process according to another embodiment of the invention;
- FIGURE 5 is a flowchart for a method for an improved way to access, annotate and display information for generating control and/or maintenance instructions for an industrial device or process according to an embodiment of the invention;
- FIGURE 6 is a flowchart for a method for an improved way to access, share, collaborate and display shared information for generating control and/or maintenance instructions for an industrial device or process according to another embodiment of the invention.
- FIGURES 7-9 show schematic diagrams of views of another screen display showing how an annotation may be fixed relative to a point on a first image, according to an embodiment of the invention.
- FIGURE 10 shows a schematic diagram of a view of a screen display showing how an annotation may be displayed as fixed relative to a point on a display screen.
- Figure 1 shows an overview for an improved system according to an embodiment of the invention. It shows a user 1, a device or equipment 7 to be monitored, repaired or controlled, and a display 3.
- the user holds a visible pointer 4 of some sort and carries a wearable computer 2.
- the user has a camera 6 of some sort, preferably mounted so as to record an image from the position, and in the orientation direction, that the user is looking in.
- the user has the display device arranged in the form of a head- up display 3, glasses or video glasses.
- a schematic view as seen by user 1 of the composite image 5 provided in display device 3 is shown. Signal, data and image information is
- the view seen by the user comprises an image, or a composite image, of a first image 7' of the real world equipment and a view 4' of the pointer 4 relative to the image 7' of the real world equipment.
- any of a text 9a, a diagram 9b or an annotation 9c may be superimposed over the image 7' of the equipment.
- the composite image 5 displayed to user 1 may
- Figure 2 shows schematically a user 1, a wearable computer 2, a first display device 3a and a second display device 3b.
- the equipment object 7, the device or process of interest, is shown arranged with a marker 8 for the purpose of identifying it.
- the user may have a camera integrated with the first display 3a worn by the user 1 as a monocular viewer, glasses or head-up display.
- the computing functions may be provided by the wearable computer 2, by the second or alternative display 3b, the glasses or head- up display 3a, and/or may be distributed among these devices.
- Figure 3 shows schematically another embodiment of the improved system.
- the figure shows the composite image 5 seen by user 1 comprising a view of the equipment 7 of interest.
- the view shows an image 4' of the pointer 4 held by the user, displayed relative to the equipment.
- Information, retrieved information 9a, 9b and added annotations 9c from one or more other users or experts may each be superimposed on any part of the composite image of the equipment.
- the annotations may be configured or arranged to be fixed relative to a part of the first view of the equipment. In this case, the annotations move and change size or even
- annotations may otherwise be configured or arranged.
- annotations may be in a fixed position relative to a part of the computer display frame or apparatus screen. In this case the annotations, or certain of them, stay on screen so that they are always available for the user to see.
- a virtual control panel or an on-line parameter reading may be continuously displayed even when the user moves his head and looks at other views .
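The two anchoring behaviours just described (an annotation fixed to the equipment in the image versus one fixed to the display frame) can be sketched as a single projection function. The mode names and coordinate conventions are assumptions for illustration:

```python
def project_annotation(anchor_mode, anchor_pos, equipment_screen_pos):
    """Return the on-screen position of an annotation.

    'image'  : the annotation follows the tracked equipment in the view,
               moving as the user's head moves
    'screen' : the annotation is fixed to the display frame, staying
               visible while the user looks at other views
    """
    if anchor_mode == "image":
        ex, ey = equipment_screen_pos
        ax, ay = anchor_pos
        return (ex + ax, ey + ay)
    if anchor_mode == "screen":
        return anchor_pos
    raise ValueError("unknown anchor mode: " + anchor_mode)

# the equipment moves on screen; only the image-anchored label follows it
label = project_annotation("image", (10, 5), (200, 150))    # (210, 155)
panel = project_annotation("screen", (10, 5), (200, 150))   # (10, 5)
```

A continuously displayed virtual control panel or on-line parameter reading would use the `"screen"` mode, while a note attached to a specific component would use `"image"`.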
- Information associated with the equipment identified by processing a unique marker 8 may be retrieved from data storage accessible by the control system, generated, and superimposed as an annotation on the image 7' of the equipment.
- the user sees the virtual control panel (VCP) as if it were attached to the real world equipment 7 because his head position is tracked by AR tracking means such as camera 6 and the image recorded and displayed to the user is adjusted accordingly.
- the VCP may be arranged so that it is always fixed to a selected part of the equipment image, or may be arranged to remain visible on the display while the user's orientation and position change.
- the virtual control panel 30 comprises a plurality of input or control elements, which may include a mode or status indicator 33, part name or number 34, one or more parameters or set-points 35, selection or activity indicators 88 and control means or control buttons 37.
- the virtual control panel may have a Status or Mode set to On Line, such that real-time values for parameters such as % load, case temperature, rotational speed in rpm and supply current in amps are displayed.
- Other Modes may be selected, for example Off Line, On Line or Trend.
- Mode Trend provides for data retrieval from the control system and presents historical data or calculates predicted data or trends.
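A minimal sketch of what a Trend mode calculation might do, assuming evenly spaced historical samples and a simple least-squares linear fit; the patent does not specify the prediction method, so this is one plausible choice among many:

```python
def predict_trend(samples, steps_ahead):
    """Least-squares linear fit over evenly spaced historical samples,
    extrapolated steps_ahead past the last sample."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + steps_ahead)

# case temperature rising 0.5 degC per sample: value predicted two samples ahead
predicted = predict_trend([70.0, 70.5, 71.0, 71.5], 2)   # 72.5
```

The historical samples would come from the control system's databases; the predicted value could then be displayed on the VCP alongside the live reading.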
- the user 1 may monitor and inspect a condition of an equipment 7 using an embodiment of the improved system.
- the user selects an equipment, which is identified in the control system by means of a marker 8 or alternative means, for example scanning methods as described below.
- User 1 may then retrieve information relevant to the identified equipment by means of the control system 12, stored in different formats in one or more databases 14, and examine the information.
- the user, an online expert or another person may record annotations by voice or text or freehand notes and attach those annotations 9c.
- FIG. 3 shows schematically attached text information 9a, attached freehand notation 9b, attached diagram or flowchart 9c.
- the attached information, which may have been contributed by other experts or users, called here annotations, may be used by user 1 as instructions or additional information to carry out actions such as: carrying out an inspection, adjusting a set point, controlling a device or process, or switching a device on or off. These actions are carried out by user 1 by means of switching or operating buttons 37 or selection means 38 on the virtual control panel 30.
- a user may then inspect a condition of a motor and access, retrieve and examine real-time values for parameters such as speed, temperature and load.
- the user may make control inputs and/or actions using a known industrial control or process control system to input set-points, switch a device on or off etc.
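A hypothetical sketch of how VCP inputs could be translated into control-system commands; the class, method names and command tuples are assumptions for illustration, not the patent's interface:

```python
class VirtualControlPanel:
    """Sketch of the VCP's control path: button presses and set-point
    edits become control-system commands (recorded here in a list that
    stands in for the real control-system link)."""

    def __init__(self):
        self.commands = []

    def set_point(self, device_id, parameter, value):
        self.commands.append(("SET", device_id, parameter, value))

    def switch(self, device_id, on):
        self.commands.append(("SWITCH", device_id, "on" if on else "off"))

vcp = VirtualControlPanel()
vcp.set_point("motor-7", "speed_rpm", 1450)  # configure a set point
vcp.switch("motor-7", on=True)               # switch the device on
```

In a real system the queued commands would be forwarded to the control system 12, subject to the user's authority or privilege data.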
- a method of an embodiment of the invention may comprise that the user 1 receives technical information or additional technical information from a local or internal colleague relevant to a device, apparatus or problem that the person is trained for or responsible for.
- the method according to an embodiment of the invention comprises the steps of a maintenance person requiring technical information picking up or, preferably, putting on user AR equipment, which may comprise a PDA, wearable computer or headset-based device, comprising a display 3, 3a, 3b.
- the control system 12 may generate information instructing the logged-on maintenance person where in the plant to go and what the problem is (the system may indicate a new alarm via text, a visual or graphic display, or text-to-speech (TTS)).
- the maintenance person or user can ask for more info via text or voice.
- the maintenance person goes to a physical location and/or a location of a plant or process section (functional location) indicated by the system, and observes the problem, alarm or other event.
- the AR equipment may be used to recognise an equipment of interest 7 by means of: an asset number or similar, a wireless node mechanism, a marker 8 installed for the purpose of identifying the equipment to an AR system, a method of scanning equipment surroundings and processing the image to identify natural features to match to a predetermined location of the equipment, or a combination of equipment identity in the incoming alarm information together with any of the above.
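The identification routes listed above (asset number, marker, alarm information, or a combination) can be combined into a simple lookup cascade. The registry structure and all names here are illustrative assumptions:

```python
def identify_equipment(registry, marker_id=None, asset_number=None,
                       alarm_equipment_id=None):
    """Try each identification route in turn against a lookup registry
    mapping marker IDs, asset numbers or alarm equipment IDs to records."""
    for key in (marker_id, asset_number, alarm_equipment_id):
        if key is not None and key in registry:
            return registry[key]
    return None   # fall back to e.g. natural-feature scanning

registry = {"MRK-8": "pump P-101", "A-4711": "pump P-101"}
by_marker = identify_equipment(registry, marker_id="MRK-8")
by_asset = identify_equipment(registry, asset_number="A-4711")
```

When none of the direct keys resolve, the natural-feature scanning method described in the text would be attempted instead.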
- the maintenance person can view service history and
- If the maintenance person cannot fix the problem on his own and needs help/support, he activates or mouse-clicks software means in the control system associated with the device or problem of interest.
- the system finds an appropriate online expert (navigation options may include using a history of previous similar problems or a maintenance history of the selected device or process), or the user may select a specific person; contact is then established by computer and/or by phone in a suitable way.
- the information stored on the specific person or expert includes a unique address: for users internal to the plant, an IP address and workstation address; for external experts, preferably an IP address, URL, phone number etc of a support database, preferably an ABB database, together with information about the local system to be used in support requests.
- the second person agrees to accept the support request and receives access to the alarm or event, and gets an overview of at least the latest event as a starting point.
- Voice contact may be established automatically, for example via voice-over-IP (VoIP) .
- Options are provided for application sharing and/or to send pictures or files, and in particular for the selected specific person to share the composite image 5 seen by the user, and to add notes, comments or other information which are attached to the composite image as annotations.
- the second person may, depending on user log-in information and/or authority or privilege data, be enabled to also control the on-screen pointer image 4' by any computer input means suitable for selecting a control element of the VCP displayed by a graphical user interface.
- Annotations may include the possibility of attaching video, voice or pictures to a support request, and retrieving a "buddy-list" to give an overview of available local experts (Figure 5). It may be desirable to add some logic to the list so that local contacts with knowledge about a specific problem are given priority, and so that the logic establishes which users or experts are currently available, logged in, on call, or suitable but not available, and in the latter case when they will be available and why.
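The priority logic suggested above for the buddy-list could, purely as a sketch with assumed fields (name, local, available, skills), be expressed as a sort over availability, problem knowledge and locality:

```python
def rank_experts(experts, problem_tag):
    """Order a 'buddy list' so that available local contacts with
    knowledge of the specific problem come first.

    Each expert is a dict with illustrative fields: name, local,
    available, and a set of known problem tags (skills).
    """
    def score(e):
        return (e["available"],              # reachable now
                problem_tag in e["skills"],  # knows this problem
                e["local"])                  # local before remote
    return sorted(experts, key=score, reverse=True)

experts = [
    {"name": "remote support", "local": False, "available": True,
     "skills": {"drive", "motor"}},
    {"name": "site engineer", "local": True, "available": True,
     "skills": {"motor"}},
    {"name": "on call tech", "local": True, "available": False,
     "skills": {"motor"}},
]
ranked = rank_experts(experts, "motor")
```

The tuple returned by `score` implements the priority order described above; extending it with "when available and why" would be a matter of adding further fields.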
- An operational difference exists between local experts and remote support (eg ABB Support).
- Figure 5 shows a first flowchart for a method which comprises the following steps: 51 User is notified by control system 12 in some way of an alarm or event
- User may retrieve and examine a list of internal operators, experts, and/or external experts from any of whom the user may choose to seek information or instructions for controlling an equipment or process.
- Figure 6 shows a second flowchart for a method which comprises the following steps:
- Figure 4 shows another embodiment of the improved system.
- the figure shows another composite image 5 seen by user 1 comprising a view of the equipment 7 of interest.
- the view shows an image 4' of the pointer 4 held by the user displayed relative the equipment.
- Retrieved information 9a, 9b and added annotations 9c from one or more other users or experts may each be superimposed on the composite image of the equipment.
- the view of the equipment 7 of interest includes a unique marker 8 positioned in the view and, in this case, positioned on the actual equipment itself.
- the marker may be identified by using image processing software to detect the presence of a marker and the identity of the marker on the equipment of interest.
- a virtual control panel 30 for the equipment 7 as in the control system identified by marker 8 may be retrieved from data storage, generated, and provided as superimposed on the image 7' of the equipment.
- the virtual control panel 30 comprises a plurality of control elements, which may include a mode or status indicator 33, part name or number 34, one or more parameters or set-points 35, selection or activity indicators 88 and control means or control buttons 37.
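As a non-authoritative sketch, the control elements of the virtual control panel 30 enumerated above might be modelled as a small data structure; the class, field names and Edit-mode check below are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualControlPanel:
    """Sketch of the virtual control panel 30: a mode or status
    indicator 33, part name or number 34, parameters or set-points 35,
    and control means or buttons 37 (field names are assumptions)."""
    mode: str = "Status"                              # indicator 33
    part_name: str = ""                               # name/number 34
    set_points: dict = field(default_factory=dict)    # parameters 35
    buttons: tuple = ("Start", "Stop")                # control means 37

    def edit_set_point(self, name, value):
        # only allow edits when the panel mode permits it
        if self.mode != "Edit":
            raise PermissionError("panel must be in Edit mode")
        self.set_points[name] = value

vcp = VirtualControlPanel(part_name="Motor 7")
vcp.mode = "Edit"
vcp.edit_set_point("case_temperature", 65.0)
```

The Edit-mode guard mirrors the behaviour described below, where set-point values may be edited only once the Status or Mode is set to Edit.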
- the virtual control panel may have a Status or Mode set to Edit, such that set-point values for motor parameters such as case temperature may be edited.
- the figure also represents an example of such method steps, in that a first annotation 9b was provided giving instructions, in this case for the user to "select Edit" mode; a dotted pointer image indicates a position of the pointer, and the pointer image 4' shown with solid lines indicates the pointer positioned on a control element and activating the control element.
- Modes such as Off Line or Trend may be selected, which retrieve from the control system and present historical data or predicted data respectively for the equipment.
- the virtual or augmented information is annotated and linked to the first view 7 of the equipment.
- the equipment has been described as identified in this case by a unique marker 8 arranged on or near the equipment, which is then matched with stored data about process devices, equipment etc.
- an annotation 9d is associated with part of a control panel in a first image 7 of the equipment.
- the view 17 of the equipment has changed, but the annotation is still fixed to the same part of the equipment image, even though the view or orientation of the user has changed.
- the equipment view 27 has changed again, and the annotation remains fixed to the same part of the equipment image.
- Figure 10 illustrates an annotation from Figure 8 arranged fixed relative the screen display, so that as the user changes orientation and position the annotation remains in view in the same place.
- the annotation 9 in the augmented reality view of the composite image of the equipment 7', 17' may not be associated with a selected part of the equipment or with any part of the displayed image, and may instead "float" or remain in view indefinitely even when the user changes position and orientation completely. Thus an annotation may remain on the display even when the user goes to another part of the plant, if the user so desires.
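The two anchoring behaviours described above, an annotation fixed to a tracked part of the equipment image (Figures 7-9) versus an annotation fixed relative the screen or "floating" (Figure 10), can be sketched as follows; the anchor field and coordinate handling are assumptions for illustration:

```python
def annotation_screen_position(annotation, equipment_screen_pos):
    """Compute where to draw an annotation for the current frame.

    anchor == "equipment": the annotation follows the tracked
    equipment part as the view changes (Figures 7-9);
    anchor == "screen": the annotation stays put on the display
    regardless of the user's position and orientation (Figure 10).
    """
    if annotation["anchor"] == "equipment":
        ex, ey = equipment_screen_pos       # tracked part, this frame
        dx, dy = annotation["offset"]       # offset from that part
        return (ex + dx, ey + dy)
    return annotation["fixed_pos"]          # screen-anchored

note = {"anchor": "equipment", "offset": (10, -20), "fixed_pos": (5, 5)}
hint = {"anchor": "screen", "offset": (0, 0), "fixed_pos": (600, 20)}
```

A screen-anchored annotation simply ignores the tracking result, which is why it can remain in view even when the user moves to another part of the plant.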
- the unique marker is identified by means other than by image processing to recognise a sign, symbol, number or other visual marker.
- Non-visual markers such as IR readable signs, or markers using other means such as acoustic transmissions may be adapted for use.
- Equipment identity may also be established using one or more purpose-specific wireless transmitters in the vicinity of the equipment of interest, or by using wireless communications based on wireless transmitters forming part of a data network in the vicinity.
- the equipment is identified for tracking purposes and tracked relative a view of the real world using an optical mapping technique that involves identifying a series of objects or features in two or more repeated images taken by a camera or video camera of the equipment and its surroundings.
- Image processing software is used to identify objects detected optically, track the same objects over a succession of images, and calculate a position and orientation of the camera or sensor receiving and taking the images. In this way, the position and orientation of the user with a tracking apparatus 6 (Figures 1, 2) can be calculated.
- the image processing may be based on recognising natural features, especially features with relatively higher contrast to their background, and subsequently calculating, based on the change in a distance between those natural features in an image, a position of the viewer relative the equipment or other object. Changes in the position and size of the annotation relative the equipment due to any change in position or orientation of the user's head are calculated in this way, and the size and position of the annotation relative the equipment (see Figures 7-9) are displayed accordingly.
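Under a simple pinhole-camera assumption, the on-image separation of two fixed natural features scales inversely with viewer distance, which is one possible basis for the calculation described above; the functions below are an illustrative sketch, not the disclosed implementation:

```python
def viewer_distance(feature_px_separation, reference_px_separation,
                    reference_distance):
    """Estimate viewer-to-equipment distance from the separation (in
    pixels) of two tracked natural features in the current image.

    Under the pinhole assumption the separation of two fixed points
    scales inversely with distance to the viewer.
    """
    return (reference_distance * reference_px_separation
            / feature_px_separation)

def annotation_scale(current_sep, first_view_sep):
    """Scale factor for an annotation in the current view relative
    the first view, so it shrinks and grows with the equipment."""
    return current_sep / first_view_sep
```

So if two features appear half as far apart as in the first view, the viewer has roughly doubled the distance and the annotation is drawn at half size.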
- the image processing may be based in part on a Natural Features method.
- This type of method concerns identifying points on an object or in a surrounding, points such as edges or corners that show some visual contrast.
- an image processing program tracks identified points in one or more successive frames. From these identified points, and the paths the points move along, the position and orientation of the scanner or camera are calculated.
- This position data is in turn used to determine, based on position or location data stored for the various process devices and equipment in the control system, which equipment in the plant is being looked at or inspected. By this means the equipment can be identified without the use of an identifying marker, when stored position data for devices and equipment at the location are available.
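Matching an estimated camera position against equipment location data stored in the control system might, as a sketch with assumed coordinates and names, look like the following nearest-neighbour lookup:

```python
import math

def equipment_at_position(estimated_pos, stored_locations, max_dist=2.0):
    """Match a camera position estimated by natural-feature tracking
    against equipment location data stored in the control system, so
    the equipment can be identified without a marker.

    Returns the nearest equipment within max_dist (metres), or None
    when nothing stored is close enough. Coordinates are illustrative.
    """
    best, best_d = None, max_dist
    for name, pos in stored_locations.items():
        d = math.dist(estimated_pos, pos)
        if d < best_d:
            best, best_d = name, d
    return best

locations = {"motor_7": (12.0, 3.5), "pump_2": (40.0, 8.0)}
```

The `max_dist` threshold guards against identifying an equipment when the user is not actually near anything stored for that location.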
- the selected part of the equipment of interest is identified and tracked in a view of the real world by measuring a distance to one or more wireless node means arranged on or near said equipment.
- Transmissions from one or more wireless nodes comprised, for example, in a wireless LAN or in an ad-hoc wireless network, in a Bluetooth pico-net or a ZigBee network may be used to calculate a relative physical location based on signal strength or other radio signal characteristics.
- the equipment may be mapped to location data stored in a control system of the industrial plant.
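One conventional way to turn signal strength into a rough distance, which could feed the relative-location calculation described above, is the log-distance path-loss model; the reference power and path-loss exponent below are assumptions that would be calibrated per site:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Rough distance estimate (metres) from received signal strength.

    tx_power_dbm is the assumed received power at 1 m from the node;
    path_loss_exp is the assumed path-loss exponent (2.0 corresponds
    to free space; indoor plants typically need a larger value).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

Distances estimated this way from several nodes of a wireless LAN, pico-net or ZigBee network can then be combined to place the user relative equipment whose node positions are stored in the control system.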
- the methods of the invention may be carried out by means of one or more computer programs comprising computer program code or software portions running on a computer, data server or a processor.
- the microprocessor (or processors) comprises a central processing unit CPU performing the steps of the method according to one or more facets of the invention, such as the methods shown in Figures 5, 6.
- the methods are performed with the aid of one or more said computer programs, which are stored at least in part in memory accessible by the one or more processors.
- a program or part-program that carries out some or all of the steps shown and described in relation to Figures 5 and 6 may be run by a computer or processor of the control system.
- At least one of the or each processors may be in a central object oriented control system in a local or distributed computerised control system.
- said computer programs may also be run on one or more general purpose industrial microprocessors or computers instead of one or more specially adapted computers or processors.
- the computer program comprises computer program code elements or software code portions that make the computer perform the method using equations, algorithms, data, stored values and calculations.
- a part of the program may be stored in a processor as above, but also in a ROM, RAM, PROM, EPROM, or EEPROM chip or similar memory means.
- the program in part or in whole may also be stored on, or in, other suitable computer readable medium such as a magnetic disk, CD-ROM or DVD disk, hard disk, magneto-optical memory storage means, in volatile memory, in flash memory, as firmware, stored on a data server or on one or more arrays of data servers.
- Other known and suitable media, including removable memory media such as removable flash memories, hard drives etc. may also be used.
- Data may also be communicated wirelessly, at least in part, to portable devices carried or worn by a user.
- communications may be carried out using any suitable protocol, including a wireless telephone system such as GSM or GPRS.
- Short range radio communication is a preferred technology, using a protocol compatible with standards issued by the Bluetooth Special Interest Group (SIG), any variation of IEEE-802.11, WiFi, WiMax, Ultra Wide Band (UWB), ZigBee or IEEE-802.15.4, IEEE-802.13 or equivalent or similar.
- Wireless communication may also be carried out using Infra Red (IR) means and protocols such as IrDA, IrCOMM or similar.
- Wireless communication may also be carried out using sound or ultrasound transducers, through the air or via work object construction, pure magnetic or electric fields (capacitive or inductive communication) or other types of light, such as for example LED, laser, as communication media with standard or proprietary protocols.
- the computer programs described above may also be arranged in part as a distributed application capable of running on several different computers or computer systems at more or less the same time. Programs, as well as data such as energy-related data, may be made available for distribution by means of OPC servers, an object request broker such as COM, DCOM or CORBA, or a web service.
Abstract
A method to process and display control instructions and technical information for an equipment, plant or process in an industrial facility. A software entity may be configured with identities of selected said equipment, plant or processes. The software entity may also retrieve information associated with said equipment, plant or process by means of being so configured. Information may be combined and annotated on a display device to provide control or maintenance instructions. A display device, a computer program and a control system are also described.
Description
Method and system for processing and displaying maintenance or control instructions
TECHNICAL AREA
This invention relates to an improved method and system for providing, processing and displaying maintenance and/or control instructions for an equipment, plant or a process by means of accessing data and stored information and displaying information relevant to the device or process. The invention provides improved means to generate and display instructions necessary to configure, maintain or repair equipment.
TECHNICAL BACKGROUND
Monitoring and control of plant and equipment in industrial facilities is largely automated and computerised. A traditional automation system is one in which each physical asset is described in a database. In its turn, this database is
frequently modeled as a multitude of tables and relationships between different fields and tables in the database. Although much technical information about various devices/equipment, process sections etc exists, it is most often distributed among different systems, each having different categories of data and/or different methods of storing and retrieving technical data. Examples of such existing and often unconnected systems are computerized maintenance systems, process control systems, power management systems, energy management systems, and systems for process simulations and optimization.
Examples of information to be found in such systems are
maintenance histories, production histories, configuration set- points, production flows, compressor efficiency curves,
references to load characteristics of drive solutions, energy consumption logs, ratings for electric motors, information about
material costs and energy costs. An installation may also include equipment from different suppliers and from different industries. It is a complex and difficult task to retrieve information from all of those systems so as to give a reliable and extensive picture, even on a historical basis. It is also very difficult, time consuming and error prone to organize, enter, maintain and retrieve information related to a specific device. It is even more difficult to retrieve and/or access such information when an alarm or other event is reported.
From the international patent application WO 01/02953 entitled Method of integrating an application in a computerized system, it is known to represent real world objects in control systems by means of one or more software components. The real world object may be a single device, an object in a process or complete equipment.
Use of augmented reality in industrial control or monitoring is known from WO 2005/066744 entitled A Virtual Control Panel, in which it is described how a computer produced image, an
augmented reality image, may be overlaid on a real world image. Manipulation of the composite image comprising real
world/augmented reality data may be carried out to provide an interaction with a control device or system.
However the prior art does not provide comprehensive access to plant, device or process section information in a timely and simple way that supports fast or on-line operations. In
addition, technical documentation and support materials for maintenance operations, repairs, configuration and/or other technical adjustments to a process, plant, or other production or processing site are often stored in different print and or digital media and are also often difficult to keep up-to-date.
SUMMARY OF THE INVENTION
According to one aspect of the present invention, one or more embodiments of the invention provide an improved method for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for
monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising identifying a said equipment, making and processing an image of a first view of the said equipment, making, retrieving or recording an annotation of control or maintenance instructions for said equipment and arranging the annotation represented as a graphic element for a display, and by combining the first view of the image and the annotation into a composite image, and displaying the composite image of said equipment combined with additional control or maintenance instructions to a user.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising recording an annotation and wherein the annotation for said equipment is placed in a selected position relative to the first view of the image .
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, device or
equipment in an industrial process or facility comprising one or more control systems for monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising generating the annotation as a part of a layer for superposition on the first view of the image.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for monitoring and control, wherein device or process-related information and other data for each said equipment, plant or process are stored in and may be retrieved by said control system, the method comprising arranging the annotation displayed relative a position on the first view of the image or relative a position on the display screen.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising displaying an instance of the composite image
displayed on a second display to a second user or expert.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising recording an annotation which is provided by an input action of a second user or expert, such as manipulating a second instance of the composite image displayed on the second display.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising adding one or more annotations to the composite image and displaying the image to a plurality of other users.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising in part providing control or maintenance instructions used to configure a set point or switch a device on or off by means of software means of part of the composite image.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising in part activating a software means and thereby providing control or maintenance instructions used to generate one or more control signals.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising identifying said equipment by image processing a unique marker means on or near said equipment.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising identifying said equipment by image processing images provided by the optical means or by processing the images in part by scanning natural features of the images.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising changing the position and/or orientation of the annotation in the composite image relative a distance or size relationship in a second view of the image dependent on a distance or size relationship determined with an image of a first view.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising changing both the position and size of the annotation relative to, and depending on, a second view of the image dependent on the size relationship determined with the first view.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising identifying said equipment by processing
transmissions from one or more wireless node means arranged on or near said equipment.
According to another embodiment of the present invention an improved method is provided for processing and displaying control or maintenance instructions for an apparatus, the method comprising generating instructions and information for carrying out a repair, re-configure, re-programming or replacement of a faulty part of said equipment, plant or process.
According to another aspect of the present invention, one or more embodiments of the invention provide an improved system for
processing and displaying control or maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems for
monitoring and control, and one or more computers, wherein device or process-related information and other data for each said equipment, plant or process are stored and may be retrieved from said control system, and comprising an AR system comprising a tracking means, a display means, at least one computer means, a data communication means for communicating an image of a first view of said equipment, and computer software image processing means for identifying the said equipment in the industrial process or facility, and computer software means to record control or maintenance instructions comprising any of text, video or voice messages and computer software for attaching said message information, and computer software means to provide a composite image of an image of said equipment combined with additional control or maintenance instructions to a user.
The invention comprises an improved system for processing and displaying an augmented reality representation of computer- generated graphical information, generated and stored in advance, overlaid the real world. Such an improved system may also comprise a communication system supporting voice
communication and transfer of the entire, or parts of, augmented reality interface and/or additional virtual information.
In summary, such an improved augmented reality system may comprise: a handheld interacting and pointing device with a tracking system for determining its position and orientation in relation to a world coordinate system, and a portable display device, preferably a wearable device such as glasses, a head-mounted display or head-up display, or else a PDA, notebook, tablet computer or similar, for visualizing augmented reality overlaid the view of the real world. In the event that video
see-through is used for augmented reality visualization, the display device further comprises a camera, such as a video camera of some sort, for capturing a stream of images of the environment, mounted on or integrated with the display device. The camera is mounted in a fixed position at the display device, and the display device is preferably located along the camera view axis and at the camera's image plane. The system combines computer-generated graphics and annotations with the live video stream and projects the combined augmented reality video onto the display device. The user sees the video with overlaid virtual information as if he was looking at the real world. For optical see-through AR, the computer-generated graphics are registered directly onto the display device and follow the user's view of the real world. The virtual graphics are overlaid on the optical image of the real world without including a video image of the real world. One or a series of still images may be used to carry out the same or similar steps or actions as with the stream of video images described above.
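The video see-through composition described above, computer-generated graphics combined with the live video stream, can be reduced to a per-pixel sketch; here a toy one-dimensional row of grey values stands in for a full bitmap blend (a real system would composite whole frames, eg with OpenGL):

```python
def compose_row(video_row, overlay_row, transparent=0):
    """Per-pixel composition of one row of the captured video image
    with one row of the computer-generated graphics layer.

    Overlay pixels replace video pixels except where the overlay is
    transparent, so the user sees the real world with virtual
    information drawn on top of it.
    """
    return [v if o == transparent else o
            for v, o in zip(video_row, overlay_row)]

# video row of grey values, overlay with an opaque annotation stroke
row = compose_row([10, 10, 10, 10], [0, 255, 255, 0])
```

For optical see-through AR only the overlay layer would be rendered, since the real world is seen directly through the display rather than via video.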
To identify the equipment of interest, unique identity markers, ID markers, may be placed at objects (equipment, devices and systems) in the environment. These are preferably processed by a recognition system for recognizing the unique IDs and use those IDs to identify a known device or equipment controlled by a control system. Further a tracking system is included for determining position and orientation of a tracking device in relation to a world coordinate system. As well, or as an
alternative, a recognition system for recognizing and tracking one or more visible objects at a physical location may be used. For example a vision-based tracking system using a recognition or tracking technique known as Natural Feature tracking may be used. For more detail about a type of natural feature tracking see, for example, the article entitled Natural Feature Tracking for Extendible Robust Augmented Realities, by Park, You & Neumann, in a paper for the first International Workshop on Augmented Reality (IWAR) 1998. This type of technique may be used with image processing to recognise one or more objects in the real world and supply information about viewer-to-object distance, and changes in that distance, so that a position of the user in relation to certain real world objects may be determined. The determinations or calculations may then be used to provide a graphic display depending in part on how the position and orientation of the user change relative to a given real object.
The processing of the image may be speeded up by combining a calculation of a movement path of the camera/viewer based on changes in the viewer-to-object distance for one or more objects. Calculations are advantageously carried out by
recognising in the visual images natural features such as corners, edges or other higher-contrast features identified in a first view and then, in successive views, calculating a movement path and orientation based on the viewer-to-object distance for features already identified in the first view or another earlier scan. In this way, by carrying out the identification initially and then not repeating the identifying processes each time, or at least not in a foreground process, and by calculating a movement path and orientation of the user from the changed positions in the image of the natural feature(s) already identified, quick identification of an equipment in its surroundings is enabled, together with faster processing, for example faster refresh of the composite image.
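The speed-up described above, identifying natural features once and then only re-locating them in later frames, might be sketched as follows; the local template search around each feature's last position is represented by a stand-in lookup function, an assumption for this illustration:

```python
def track_cached_features(prev_features, locate_in_current_frame):
    """Re-locate previously identified natural features instead of
    re-running full feature detection on every frame.

    prev_features maps a feature id to its last image position;
    locate_in_current_frame(fid, x, y) stands in for a local search
    around that position in the new frame.
    """
    moved = {}
    for fid, (x, y) in prev_features.items():
        moved[fid] = locate_in_current_frame(fid, x, y)
    # mean displacement approximates the camera's movement path
    n = len(moved)
    dx = sum(moved[f][0] - prev_features[f][0] for f in moved) / n
    dy = sum(moved[f][1] - prev_features[f][1] for f in moved) / n
    return moved, (dx, dy)

prev = {"corner_a": (100, 50), "edge_b": (200, 80)}
moved, motion = track_cached_features(prev, lambda f, x, y: (x + 4, y - 2))
```

Full re-detection can then run only occasionally, or in a background process, to pick up new features as the user moves.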
A major advantage of the present invention is that a maintenance action to respond to a new alarm or other event may be handled in a more timely way. Collaboration between and contact with internal users, technicians, operators and/or engineers with known and recorded technical information and/or expertise may be
carried out quickly and effectively. The engineer, operator or other user has a portable display device or portable computing and display device. This may comprise a device such as a PDA, or a head-up display arranged as glasses or on an adapted helmet. Preferably a camera or other position tracking means is mounted on or near the glasses, so capturing quite closely the user's view of the equipment, including the user's position and orientation. Communication is preferably carried out using a head-set, mobile phone, or phone with a hands-free type attachment, so that the user may communicate with other users or experts and at the same time have one or more hands free for manipulating the display and/or making an adjustment to a control panel or component etc of the equipment, plant or process.
Collaboration with other internal users known to have the technical information and/or expertise that has been determined to be relevant to the equipment is enabled. Where necessary external experts or consultants may also be quickly identified and/or contacted. Information already stored in one or more different places or databases may be brought together in one situation or meeting room or electronic document and information or annotations from internal or external experts added to the document to provide the information and or instructions
necessary to carry out the control, maintenance or eg
configuration task in question. Annotations may be made in real time to information, text or graphics displayed to each user and or expert collaborating over the issue. Annotations or
attachments in the form of text annotations, digital files, video clips, sound clips may be attached to a shared electronic document or notice board, shared display screen means, virtual white board etc. The information retrieved and/or developed during collaboration provides instruction to the user for monitoring, troubleshooting, configuring or repairing the equipment.
The method and system are facilitated by one or more computer programs in the control system and a computer architecture with which each user may be configured in a database, log-in file, table or similar to record one or more types of technical expertise he/she is known to possess and one or more types of communication means to the expert. When a software entity representing a component part of the equipment, plant or process in the control system is selected, a second software entity may be activated to access and retrieve information, especially contact information, phone numbers, IP phone numbers,
workstation URLs, GUIDs etc, for all internal (or external) users with expertise on the given component.
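The second software entity described above, which retrieves contact information for users registered with expertise on a selected component, could be sketched as follows; the database layout and field names are illustrative assumptions:

```python
def experts_for_component(component_id, user_db):
    """Given a component selected in the control system, retrieve
    contact details (phone number, IP phone number, workstation URL
    etc) for all users registered with expertise on that component.
    """
    return [
        {"name": u["name"], "contact": u["contact"]}
        for u in user_db
        if component_id in u["expertise"]
    ]

# illustrative log-in/expertise records as described above
user_db = [
    {"name": "A", "contact": "sip:a@plant", "expertise": {"motor_7"}},
    {"name": "B", "contact": "+4612345",
     "expertise": {"pump_2", "motor_7"}},
]
found = experts_for_component("motor_7", user_db)
```

The same lookup would serve external experts by returning the address of a support database instead of a personal contact.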
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention will now be described, by way of example only, with particular reference to the accompanying drawings in which:
FIGURE 1 shows a schematic diagram of an overview of a system to generate control and/or maintenance instructions for an
industrial device or process according to an embodiment of the invention.
FIGURE 2 is a schematic diagram overview of a system to generate control and/or maintenance instructions for an industrial device or process according to another embodiment of the invention;
FIGURE 3 is a schematic diagram of a screen display showing a selection on a composite image provided to generate control and/or maintenance instructions for an industrial device or process according to an embodiment of the invention;
FIGURE 4 is a schematic diagram of a screen display showing a selection on a composite image provided to generate control and/or maintenance instructions for an industrial device or process according to another embodiment of the invention;
FIGURE 5 is a flowchart for a method for an improved way to access, annotate and display information for generating control and/or maintenance instructions for an industrial device or process according to an embodiment of the invention;
FIGURE 6 is a flowchart for a method for an improved way to access, share, collaborate and display shared information for generating control and/or maintenance instructions for an industrial device or process according to another embodiment of the invention.
FIGURES 7-9 show schematic diagrams of views of another screen display showing how an annotation may be fixed relative to a point on a first image, according to an embodiment of the invention.
FIGURE 10 shows a schematic diagram of a view of a screen display showing how an annotation may be displayed as fixed relative to a point on a display screen.
DETAILED DESCRIPTION OF THE EMBODIMENTS
Figure 1 shows an overview of an improved system according to an embodiment of the invention. It shows a user 1, a device or equipment 7 to be monitored, repaired or controlled, and a display 3. The user holds a visible pointer 4 of some sort and carries a wearable computer 2. The user has a camera 6 of some sort, preferably mounted so as to record an image from the position and in the orientation direction that the user is looking in. The user has the display device arranged in the form of a head-up display 3, glasses or video glasses. A schematic view, as seen by user 1, of the composite image 5 provided in display device 3 is shown. Signal, data and image information is communicated to the display device and/or the control system 12 and databases 14 by the wearable computer 2 via a data network 10.
The view seen by the user comprises an image, or a composite image, of a first image 7' of the real world equipment and a view 4' of the pointer 4 relative to the image 7' of the real world equipment. In addition, any of a text 9a, a diagram 9b or an annotation 9c may be superimposed over the image 7' of the equipment. The composite image 5 displayed to user 1 may comprise a combination of an image 7' of the equipment of interest, an image of some information in the form of an image or text 9a, a diagram or drawing 9b, and an added text, sketch, graphic or diagram 9c that has been added to the other two images as an annotation or attachment.
Figure 2 shows schematically a user 1, a wearable computer 2, a first display device 3a and a second display device 3b. The equipment object 7, the device or process of interest, is shown arranged with a marker 8 for the purpose of identifying it. The user may have a camera integrated with the first display 3a worn by the user 1 as a monocular viewer, glasses or head-up display. The computing functions may be provided by the wearable computer 2, by the second or alternative display 3b, by the glasses or head-up display 3a, and/or may be distributed among these devices.
Figure 3 shows schematically another embodiment of the improved system. The figure shows the composite image 5 seen by user 1 comprising a view of the equipment 7 of interest. The view shows an image 4' of the pointer 4 held by the user displayed relative to the equipment. Information, retrieved information 9a, 9b and added annotations 9c from one or more other users or experts may each be superimposed on any part of the composite image of the equipment. The annotations may be configured or arranged to be fixed relative to a part of the first view of the equipment. In this case, the annotations move, change size or even disappear from the composite image on the display when the user turns his/her head or changes position. In this way, the annotations are always associated with a particular part of the equipment or of the image (see also the description and Figures 7-10).
The annotations may otherwise be configured or arranged differently. For example, the annotations may be in a fixed position relative to a part of the computer display frame or apparatus screen. In this case, the annotations, or certain of them, stay on screen independent of how the user changes his/her orientation or position. In this way, particularly critical or selected annotations are always available for the user to see. For example, a virtual control panel or an on-line parameter reading may be continuously displayed even when the user moves his head and looks at other views.
Referring again to Figure 3, in the composite image 5 displayed to the user a virtual control panel 30 for the equipment 7, as identified to the control system by, for example, image processing of a unique marker 8, may be retrieved from data storage accessible by the control system, generated, and provided superimposed as an annotation on the image 7' of the equipment. When the user looks at the real world equipment, the user sees the virtual control panel (VCP) as if it were attached to the real world equipment 7, because his head position is tracked by AR tracking means such as camera 6 and the image recorded and displayed to the user is adjusted accordingly. In this case the VCP may be arranged always fixed to a selected part of the equipment image, or may be arranged to remain visible on the display while the user orientation and position change.
The virtual control panel 30 comprises a plurality of input or control elements, which may include a mode or status indicator 33, a part name or number 34, one or more parameters or set-points 35, selection or activity indicators 88 and control means or control buttons 37. In the figure it is schematically shown that the virtual control panel may have a Status or Mode set to On Line, such that real time values for parameters such as % load, case temperature, rotational speed in rpm and supply current in amps etc. are displayed. Other Modes may be selected, such as for example Off Line, On Line or Trend. Mode Trend provides for data retrieval from the control system and presents historical data, or calculates predicted data or trends, for the equipment of interest.
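By way of illustration only, the mode-dependent behaviour of such a virtual control panel may be sketched as follows. The class, method and parameter names are illustrative assumptions and form no part of the described system; the sample parameters (load, rpm) follow the examples given above.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualControlPanel:
    part_name: str                                   # part name or number 34
    mode: str = "On Line"                            # mode or status indicator 33
    parameters: dict = field(default_factory=dict)   # parameters or set-points 35

    def display_values(self, history=None):
        """Return the values to present for the currently selected Mode."""
        if self.mode == "On Line":
            return self.parameters                   # live, real-time readings
        if self.mode == "Trend" and history:
            # present historical data; a real system could also extrapolate a trend
            return {k: sum(v) / len(v) for k, v in history.items()}
        return {}                                    # Off Line: nothing live shown

panel = VirtualControlPanel("Motor M-101",
                            parameters={"load_pct": 72, "rpm": 150})
print(panel.display_values())  # {'load_pct': 72, 'rpm': 150}
```

Switching `panel.mode` to "Trend" and supplying stored history then yields summarised historical values instead of the live readings.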
The user 1 may monitor and inspect a condition of an equipment 7 using an embodiment of the improved system. The user selects an equipment, which is identified in the control system by means of a marker 8 or alternative means, for example scanning methods as described below. User 1 may then retrieve information relevant to the identified equipment by means of the control system 12, stored in different formats in one or more databases 14, and examine the information. The user, an online expert or another person may record annotations by voice or text or freehand notes and attach those annotations 9c.
Other selected local or remote users, or local or remote experts, may share the composite image seen by the user 1 for the purpose of collaboration. The other users may add information in the form of notes or voice recordings or video clips which are attached to the composite image and shared by all selected users. Figure 3 shows schematically attached text information 9a, attached freehand notation 9b, and attached diagram or flowchart 9c. The attached information, which may have been contributed by other experts or users, called here annotations, may be used by user 1 as instructions or additional information to carry out actions such as: carry out an inspection, adjust a set point, control a device or process, switch a device on or off. These actions are carried out by the user 1 by means of switching or operating buttons 37 or selection means 38 on the virtual control panel 30. This is done by the user 1 moving the image 4' of pointer 4 viewed in the composite image 5 to select those buttons, by the action of moving the actual pointer 4 in the real world. As shown in Figure 3 a user may then inspect a condition of a motor and access, retrieve or examine real time values for parameters such as speed, temperature, and load. Alternatively the user may make control inputs and/or actions using a known industrial control or process control system to input set-points, switch a device on or off etc.
Thus in an aspect a method of an embodiment of the invention may comprise that the user 1 receives technical information or additional technical information from a local or internal colleague relevant to a device, apparatus or problem that the person is trained for or responsible for.
The method according to an embodiment of the invention comprises the steps that a maintenance person requiring technical information picks up or, preferably, puts on user AR equipment, which may be a PDA, wearable computer or headset based device, comprising a display 3, 3a, 3b. To deal with an incoming alarm, the control system 12 may generate information instructing the logged-on maintenance person where in the plant to go and what the problem is (the system may indicate a new alarm via text, a visual or graphic display, or text-to-speech (TTS)). The maintenance person or user can ask for more information via text or voice. The maintenance person goes to a physical location and/or a location of a plant or process section (functional location) indicated by the system, and observes the problem, alarm or other event. The AR equipment may be used to recognise an equipment of interest 7 by means of: an asset number or similar, a wireless node mechanism, a marker 8 installed for the purpose of identifying the equipment to an AR system, a method of scanning the equipment surroundings and processing the image to identify natural features to match to a predetermined location of the equipment, or a combination of equipment identity in the incoming alarm information together with any of the above. When the equipment of interest has been identified by one or other method to the control system, the maintenance person or user can then access stored and/or real time information associated with the equipment through pre-configured associations of the control system.
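Purely by way of illustration, resolving the equipment of interest from whichever identification method succeeded (marker, asset number, wireless node or scanned location) can be sketched as a simple registry lookup. The registry contents and key naming scheme here are invented assumptions, not part of the described control system.

```python
# Hypothetical registry mapping identification results to equipment records
# in the control system; all keys and values are illustrative.
EQUIPMENT_REGISTRY = {
    "marker:8": "pump-P42",
    "asset:A-1234": "pump-P42",
    "location:(12,7)": "pump-P42",
}

def identify_equipment(candidates):
    """Try each identification result in turn; the first registry hit wins."""
    for key in candidates:
        if key in EQUIPMENT_REGISTRY:
            return EQUIPMENT_REGISTRY[key]
    return None  # no method identified the equipment

# e.g. the asset tag was unreadable but the marker scan succeeded
print(identify_equipment(["asset:unknown", "marker:8"]))  # pump-P42
```

Combining several identification sources in one ordered list mirrors the text's point that marker, asset-number and location methods may be used alone or in combination.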
The maintenance person can view service history and documentation, such as for example:
-lists with instructions from the supplier of the particular object,
-top 5 problems for this particular device or item,
-view or listen to user or other operator notes recorded at an earlier date by a maintenance person or other operator who has fixed problems in the past (which may even be his own notes). The user can enter a new text, record a voice operator note, or record a sound from the equipment etc. regarding the problem.
If the maintenance person cannot fix the problem on his own, and needs help/support, he activates or mouse-clicks software means in the control system associated with the device or problem of interest. The system finds an appropriate online expert (where navigation options may include using a history of previous similar problems or a maintenance history of the selected device, process) or the user may select a specific person, and contact is then established by computer and/or by phone in a suitable way.
The information stored on the specific person or expert includes a unique address: internally, IP addresses and workstation addresses of users internal to the plant, and, preferably for external experts, an IP address, URL, phone number etc. of a support database, preferably an ABB database, and information about the local system to be used in support requests.
The second person agrees to accept the support request and receives access to the alarm or event, and gets an overview of at least the latest event as a starting point. Voice contact may be established automatically, for example via voice-over-IP (VoIP). Options are provided for application sharing and/or to send pictures or files, and in particular for the selected specific person to share the composite image 5 seen by the user, and to add notes, comments or other information which are attached to the composite image as annotations. The second person may, depending on user log-in information and/or authority or privilege data, also be enabled to control the on-screen pointer image 4' by any computer input means suitable for selecting a control element of the VCP displayed by a graphical user interface.
Annotations may include the possibility of attaching video, voice or pictures to a support request, and retrieving a "buddy-list" to give an overview of available local experts (Figure 5). It may be desirable to add some logic to the list so that local contacts with knowledge about a specific problem are given priority, and so that the logic establishes which users or experts are currently available, logged in, on call, or suitable but not available, when they will be available and why. An operational difference exists between local experts and remote (e.g. ABB Support) experts: support from remote experts is most often asynchronous at first, whilst contact with a local expert is typically synchronous from the beginning.
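The buddy-list prioritisation logic described above might, for illustration only, be sketched as below. The expert records, scoring weights and field names are assumptions introduced for the example; the described system does not prescribe any particular scoring scheme.

```python
def rank_experts(experts, problem_tag):
    """Rank available experts: knowledge of the specific problem first,
    then local contacts (synchronous from the start) over remote ones."""
    def score(e):
        s = 0
        if problem_tag in e.get("expertise", []):
            s += 2          # knowledge about this specific problem: priority
        if e.get("local"):
            s += 1          # local contact, contact can be synchronous
        return s
    available = [e for e in experts if e.get("available")]  # only reachable now
    return sorted(available, key=score, reverse=True)

experts = [
    {"name": "remote ABB support", "expertise": ["motor"], "available": True},
    {"name": "on-site engineer", "expertise": ["motor"], "local": True,
     "available": True},
    {"name": "off-shift expert", "expertise": ["motor"], "available": False},
]
print(rank_experts(experts, "motor")[0]["name"])  # on-site engineer
```

Filtering on availability before ranking reflects the requirement that the logic establish who is now logged in or on call, not merely who is suitable.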
Figure 5 shows a first flowchart for a method which comprises the following steps:
51 User is notified by control system 12 in some way of an alarm or event
52 User takes AR inspection system, goes to the process, and an equipment of interest is identified to the control system
53 User examines equipment and the AR system displays a composite image 5 of the real equipment combined with AR information
55 Information such as documentation, annotations recorded in sound, picture or video form, a virtual control panel and so on is retrieved by software means of the control system 12 from databases such as 14,
56 User examines the retrieved information, may request other information, and decides if the present alarm or event situation is already known or described
59 If already described, the user arranges for a repair, adjustment or replacement etc.,
57 If not already described, the user records annotations of status, parameters, set points or other data by sound recording, text input, freehand sketch and so on,
58 User may retrieve and examine a list of internal operators, experts, and/or external experts with any of whom the user may choose to seek information or instructions for controlling an equipment or process.
Figure 6 shows a second flowchart for a method which comprises the following steps:
61 User chooses one or more internal and/or external experts
63 Request communicated to chosen internal and/or external expert contact means, IP phone, video phone, URL etc. addresses
64 Expert(s) respond or agree via messaging system, SMS, e-mail, phone etc. to help
65 Internal and/or external expert receives access to latest and all event details, access to related information via the control system, including latest annotations 57 added by user,
67 User and expert(s) discuss available technical inspection data, status and other information in the composite image 5, retrieve other information, exchange information and so on,
68 User and expert(s) record equipment configuration changes or set point changes, inspection data, pictures, sounds, voice, text or handwriting via a suitable touch screen or stylus etc. input device,
69 After documenting technical data the user arranges a fix or adjustment by actions such as switching equipment on or off, generating a control signal or changing one or more set-points via the virtual control panel 30 or via the control system 12, or making changes via a control unit physically at the location, and/or arranges for other maintenance actions, or repair or replacement as necessary.
Figure 4 shows another embodiment of the improved system. The figure shows another composite image 5 seen by user 1 comprising a view of the equipment 7 of interest. The view shows an image 4' of the pointer 4 held by the user displayed relative to the equipment. Information, retrieved information 9a, 9b and added annotations 9c from one or more other users or experts may each be superimposed on the composite image of the equipment. The view of the equipment 7 of interest includes a unique marker 8 positioned in the view and, in this case, positioned on the actual equipment itself. The marker may be identified by using image processing software to detect the presence of a marker and the identity of the marker on the equipment of interest.
In the composite image 5 displayed to the user a virtual control panel 30 for the equipment 7, as identified in the control system by marker 8, may be retrieved from data storage, generated, and provided superimposed on the image 7' of the equipment. As described above in relation to Figure 3, the virtual control panel 30 comprises a plurality of control elements, which may include a mode or status indicator 33, a part name or number 34, one or more parameters or set-points 35, selection or activity indicators 88 and control means or control buttons 37. In the figure it is schematically shown that the virtual control panel may have a Status or Mode set to Edit, such that set-point values for motor parameters such as case temperature, rotational speed in rpm and supply current in amps may be edited.
A collaborative session between the user and other local or remote engineers, specialists or experts may result in information being produced and attached to the composite image 5. The figure also represents an example of such method steps, in that a first annotation 9b was provided giving instructions, in this case for the user to "select Edit" mode, with a dotted representation of the pointer image 4' applied to a control element of the VCP to select that Edit mode. Secondly, another annotation 9b is made instructing the user to "increase RPM to 200", i.e. for the user to change the motor speed set-point and increase it from 150 to 200 rpm. The pointer image 4' drawn with solid lines indicates the pointer image 4' positioned on a control element and activating the control element.
Other Modes may be selected, such as for example Off Line or Trend, which retrieve from the control system and present historical data or predicted data respectively for the equipment.
The virtual or augmented information is annotated and linked to the first view 7 of the equipment. Up to now the equipment has been described as identified, in this case, by a unique marker 8 arranged on or near the equipment, which is then matched with stored data about process devices, equipment etc. Thus, as shown in Figures 7-9, an annotation 9d is associated with part of a control panel in a first image 7 of the equipment. In Figure 8 the view 17 of the equipment has changed, but the annotation is still fixed to the same part of the equipment image, despite the fact that the view or orientation of the user has changed. Similarly, in Figure 9 the equipment view 27 is different from the first view 7, but the annotation is still associated with the selected equipment part, so that the user sees the annotation whenever the equipment shown in the first view 7 is visible to him. If the user looks away the annotation vanishes. If the user looks back to the equipment the annotation re-appears, because it is associated with, or fixed to, a selected part of the first image. Figure 10 illustrates an annotation from Figure 8 arranged fixed relative to the screen display, so that as the user changes orientation and position the annotation remains in view in the same place.
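The two anchoring modes contrasted above, equipment-fixed (Figures 7-9) and screen-fixed (Figure 10), may be sketched, purely for illustration, with a deliberately simplified camera model in which head movement is a 2D pixel offset. Real AR tracking would use a full pose and projection; the function and its coordinate convention are assumptions for this sketch.

```python
def annotation_screen_pos(anchor, camera_offset, mode):
    """Where to draw an annotation given its anchor point and head movement.

    anchor:        (x, y) screen position when the user first placed the note
    camera_offset: (dx, dy) accumulated head/camera movement since placement
    mode:          "equipment-fixed" or "screen-fixed"
    """
    if mode == "screen-fixed":
        return anchor  # stays put on the display whatever the user does
    # equipment-fixed: the note shifts opposite to the camera movement,
    # so it appears glued to the tracked part of the equipment image
    return (anchor[0] - camera_offset[0], anchor[1] - camera_offset[1])

# the user turns his head 30 px to the right:
print(annotation_screen_pos((100, 50), (30, 0), "equipment-fixed"))  # (70, 50)
print(annotation_screen_pos((100, 50), (30, 0), "screen-fixed"))     # (100, 50)
```

An equipment-fixed annotation that leaves the visible frame is simply not drawn, matching the described behaviour that it vanishes when the user looks away and re-appears when he looks back.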
In another embodiment the annotation 9 in the augmented reality view of the composite image of the equipment 7', 17' may not be associated with a selected part of the equipment, or with any part of the displayed image, and may instead "float" or remain in view indefinitely even when the user changes position and orientation completely. Thus an annotation may remain on the display even when the user goes to another part of the plant, if the user so desires.
In another embodiment the unique marker is identified by means other than by image processing to recognise a sign, symbol, number or other visual marker. Non-visual markers such as IR readable signs, or markers using other means such as acoustic transmissions may be adapted for use. Equipment identity may also be established using one or more purpose-specific wireless
transmitters in the vicinity of the equipment of interest, or by using wireless communications based on wireless transmitters forming part of a data network in the vicinity.
In another embodiment the equipment is identified for tracking purposes and tracked relative to a view of the real world using an optical mapping technique that involves identifying a series of objects or features in two or more repeated images, taken by a camera or video camera, of the equipment and its surroundings. Image processing software is used to identify objects detected optically, track the same objects over a succession of images, and calculate a position and orientation of the camera or sensor taking the images. In this way, the position and orientation of the user with a tracking apparatus 6 (Figures 1, 2) can be calculated. The image processing may be based on recognising natural features, especially features with relatively higher contrast to their background, and subsequently calculating, based on the change in the distance between those natural features in an image, a position of the viewer relative to the equipment or other object. Changes in the position and size of the annotation relative to the equipment due to any change in position or orientation of the user's head are calculated in this way, and the size and position of the annotation relative to the equipment (see Figures 7-9) are displayed accordingly.
The image processing may be based in part on a Natural Features method. This type of method concerns identifying points on an object or in a surrounding, points such as edges or corners that show some visual contrast. As the scanner or camera moves in orientation and point of view, an image processing program tracks the identified points in one or more successive frames. From these identified points, and the paths along which the points move, the position and orientation of the scanner or camera is calculated. This position data is in turn used to determine, based on position or location data stored for the various process devices and equipment in the control system, which equipment in the plant is being looked at or inspected etc., on the basis of the position so estimated. By this means the equipment can be identified without the use of an identifying marker, when stored position data for devices and equipment at the location are available.
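The core geometric idea of the Natural Features tracking above, that the apparent pixel separation between two tracked high-contrast points scales inversely with viewer-to-object distance under a pinhole camera model, can be sketched as follows. The 2 m reference distance and the feature coordinates are assumptions invented for the example.

```python
import math

def feature_gap(p1, p2):
    """Pixel distance between two tracked natural features in one frame."""
    return math.dist(p1, p2)

def estimate_distance(ref_pixel_gap, ref_distance, new_pixel_gap):
    """Similar-triangles estimate: the apparent gap halves when the
    viewer is twice as far from the object (pinhole camera model)."""
    return ref_distance * ref_pixel_gap / new_pixel_gap

gap0 = feature_gap((100, 100), (180, 100))  # 80 px at the 2 m reference pose
gap1 = feature_gap((100, 100), (140, 100))  # 40 px in a later frame
print(estimate_distance(gap0, 2.0, gap1))   # 4.0, i.e. the viewer moved back
```

A full implementation would track many such feature pairs over successive frames and solve for the complete camera pose, but the per-pair distance cue shown here is what drives the annotation resizing described for Figures 7-9.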
In another embodiment the selected part of the equipment of interest is identified and tracked in a view of the real world by measuring a distance to one or more wireless node means arranged on or near said equipment. Transmissions from one or more wireless nodes comprised, for example, in a wireless LAN or in an ad-hoc wireless network, in a Bluetooth pico-net or a ZigBee network, may be used to calculate a relative physical location based on signal strength or other radio signal characteristics. From measurements of relative distance the equipment may be mapped to location data stored in a control system of the industrial plant.
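Signal-strength ranging of the kind referred to above is commonly done with the standard log-distance path-loss model; the following sketch is illustrative only, and the reference power of -40 dBm at 1 m and the path-loss exponent n = 2 (free space) are assumed values, not parameters from the described system.

```python
def rssi_to_distance(rssi_dbm, tx_power_at_1m=-40.0, n=2.0):
    """Invert the log-distance model RSSI = P_1m - 10*n*log10(d)
    to estimate the distance d (in metres) to a wireless node."""
    return 10 ** ((tx_power_at_1m - rssi_dbm) / (10.0 * n))

# a node heard at -60 dBm is, under these assumptions, about 10 m away
print(round(rssi_to_distance(-60.0), 1))  # 10.0
```

Distances estimated this way from several nodes could then be combined (e.g. by trilateration) and matched against the node locations stored in the control system to identify the nearby equipment.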
The methods of the invention may be carried out by means of one or more computer programs comprising computer program code or software portions running on a computer, data server or processor. The microprocessor (or processors) comprises a central processing unit CPU performing the steps of the method according to one or more facets of the invention, such as the methods shown in Figures 5 and 6. The methods are performed with the aid of one or more of said computer programs, which are stored at least in part in memory accessible by the one or more processors. For example, a program or part-program that carries out some or all of the steps shown and described in relation to Figures 5 and 6 may be run by a computer or processor of the control system. At least one of the, or each, processors may be in a central object oriented control system in a local or distributed computerised control system. It is to be understood that said computer programs may also be run on one or more general purpose industrial microprocessors or computers instead of one or more specially adapted computers or processors.
The computer program comprises computer program code elements or software code portions that make the computer perform the method using equations, algorithms, data, stored values and calculations previously described. A part of the program may be stored in a processor as above, but also in a ROM, RAM, PROM, EPROM or EEPROM chip or similar memory means. The program, in part or in whole, may also be stored on, or in, other suitable computer readable media such as a magnetic disk, CD-ROM or DVD disk, hard disk, magneto-optical memory storage means, volatile memory, flash memory, firmware, or stored on a data server or on one or more arrays of data servers. Other known and suitable media, including removable memory media such as removable flash memories, hard drives etc., may also be used.
Data may also be communicated wirelessly, at least in part, to portable devices carried or worn by a user. Wireless
communications may be carried out using any suitable protocol, including a wireless telephone system such as GSM or GPRS.
Short range radio communication is a preferred technology, using a protocol compatible with standards issued by the Bluetooth Special Interest Group (SIG), any variation of IEEE-802.11, WiFi, WiMax, Ultra Wide Band (UWB), ZigBee or IEEE-802.15.4, IEEE-802.13 or equivalent or similar. Wireless communication may also be carried out using Infra Red (IR) means and protocols such as IrDA, IrCOMM or similar. Wireless communication may also be carried out using sound or ultrasound transducers, through the air or via the work object construction, pure magnetic or electric fields (capacitive or inductive communication), or other types of light, such as for example LED or laser, as communication media with standard or proprietary protocols.
The computer programs described above may also be arranged in part as a distributed application capable of running on several different computers or computer systems at more or less the same time. Programs, as well as data such as energy related information, may each be made available for retrieval, delivery or, in the case of programs, execution over the Internet. Data and/or methods may be accessed by software entities or other means of the control system by means of any of the list of: OPC, OPC servers, an object request broker such as COM, DCOM or CORBA, or a web service.
It is also noted that while the above describes exemplifying embodiments of the invention, there are several variations and modifications which may be made to the disclosed solution without departing from the scope of the present invention as defined in the appended claims.
Claims
1. Method for processing and displaying control or maintenance instructions for an apparatus, device or equipment (7) in an industrial process or facility comprising one or more control systems (12) for monitoring and control, wherein device or process-related information and other data for each said
equipment, plant or process are stored and may be retrieved by said control system, characterised by
-identifying a said equipment (7) and making and processing an image of a first view (7') of the said equipment,
-making, retrieving or recording an annotation (9, 9a-c) of control or maintenance instructions for said equipment and arranging the annotation represented as an active or passive graphic element for a display,
-combining the first view of the image and the annotation into a composite image (5) , and
-displaying the composite image of said equipment combined with additional control or maintenance instructions to a user.
2. A method according to claim 1, wherein the annotation for said equipment is placed in the display at a selected position relative to a part of the first view of the image.
3. A method according to claim 2, wherein the annotation is generated as a part of a layer for superposition on the first view of the image.
4. A method according to claim 2 or 3, wherein the annotation is placed in the display at a selected position relative to a position on the display screen.
5. A method according to claim 1, wherein a display of an instance of the composite image is displayed on a second display to a second user or expert.
6. A method according to claim 5, wherein an annotation is provided by an action of a second user or expert manipulating a second instance of the composite image displayed on the second display.
7. A method according to claim 6, wherein data relevant to said equipment in the form of one or more annotations is added to the composite image and displayed to a plurality of other users.
8. A method according to any of claims 1-7, wherein part (30, 4', 37) of the composite image (5) comprises software means for generating control or maintenance instructions and is arranged with means to configure a set point, to generate a control signal or to switch a device on or off, in response to user activation (4').
9. A method according to claim 1, wherein said equipment is identified by unique marker means (8) on or near said equipment.
10. A method according to claim 9, wherein said equipment is identified in part by applying an image processing software to the unique marker means (8) .
11. A method according to claim 1, wherein said equipment is identified in part from images produced by optical means (6) .
12. A method according to claim 11, wherein said equipment is identified in part by applying an image processing software to the images provided by the optical means.
13. A method according to claim 12, wherein the images are processed in part by scanning natural features of said
equipment, other objects and/or surroundings of said equipment.
14. A method according to claim 13, wherein the images are processed in part by calculating a movement path of the
camera/viewer based on changes calculated in a viewer-to-object distance for one or more natural features of said equipment, other objects and/or surroundings of said equipment.
15. A method according to any of claims 1-14, wherein a size relationship between two or more parts (7', 4') of the first view of the image is determined and used to calculate a position and/or orientation of a user (1).
16. A method according to claim 15, wherein the position and/or orientation of the annotation relative to a second view of the image is changed dependent on the distance relationship or a relative size relationship determined with the first view.
17. A method according to claim 16, wherein either or both of the position and size of the annotation are changed by calculating image part distance relationships in the second view of the image dependent on the distance relationship between parts of the image, or a relative size relationship, determined for the first view.
18. A method according to any of claims 15-17, wherein a movement path of the user's gaze and orientation is calculated during present and successive views depending on the distance or size relationship of parts of the image with each other determined for a previous view.
19. A method according to claim 1, wherein said equipment is identified by processing transmissions from one or more wireless node means arranged on or near said equipment.
20. A method according to any previous claim, characterised by generating instructions and information for carrying out any member of the group consisting of: repair, re-configuration, re-programming or replacement of a faulty part of said equipment, plant or process.
21. A computer program for processing and displaying control or maintenance instructions for an apparatus, device or equipment (7) in an industrial process or facility comprising one or more control systems (12) for monitoring and control, comprising computer code means and/or software code portions which, when run on a computer or processor, will make said computer or processor perform the steps of a method according to any of claims 1-20.
22. A computer program product comprising a computer program according to claim 21 comprised in one or more computer readable media.
23. A system for processing and displaying control or
maintenance instructions for an apparatus, device or equipment in an industrial process or facility comprising one or more control systems (12) for monitoring and control, and one or more computers, wherein device or process-related information and
other data for each said equipment, plant or process are stored and may be retrieved from said control system,
characterised by
-an AR system comprising a tracking means (6), a display means (3, 3a, 3b), at least one computer means (2), and a data communication means (10) for communicating an image (7') of a first view (7) of said equipment,
-computer software image processing means for identifying the said equipment in the industrial process or facility,
-computer software means to record control or maintenance instructions comprising any of text, video or voice messages and computer software for attaching said message information, and
-computer software means to provide a composite image of an image of said equipment combined with additional control or maintenance instructions to a user.
24. A system according to claim 23, wherein the augmented reality system comprises a handheld interacting and pointing device (4).
25. A system according to claim 24, wherein the tracking system (6) comprises means for determining position and orientation of the handheld device.
26. A system according to claim 25, wherein the tracking system (6) comprises means for determining position and orientation of the handheld device based on one or more identifying markers (8) .
27. A system according to claim 25, wherein the tracking system comprises means for determining position and orientation of the
handheld device based on processing an image comprising one or more identifying markers (8) .
28. A system according to claim 25, wherein the tracking system comprises means for determining position and orientation of the handheld device based on identifying one or more natural features in the view of said equipment or surroundings thereto.
29. A system according to claim 25, wherein the tracking system comprises means for processing an image to determine position and orientation of the handheld device based on spatial
relationships between the one or more natural features in the view of said equipment, other objects or surroundings thereto and with the handheld device.
30. A system according to claim 29, wherein the one or more visible features in the view of said equipment or surroundings comprise one or more edges or corners.
31. A system according to claim 23, wherein the system comprises a portable or wearable display device for visualizing a composite image comprising augmented reality overlaid on a real world view, which may be any from the list of: a monocular device, glasses, goggles, head-mounted display, helmet-mounted display, head-up display.
32. A system according to any of claims 23-31, wherein the data communication means (10) comprises a data network or LAN which may include one or more wireless nodes.
33. A system according to claim 32, wherein the system comprises a computer software means for arranging a position of an annotation (9, 9a, 9b, 9c) relative to part of an image or relative to part of a display means of a display device (3, 3a, 3b).
34. A system according to claim 23, characterised by comprising computer program means according to any of claims 21, 22.
35. Use of a system for maintenance and/or control in an industrial process or device according to any of claims 24-28 for providing control or maintenance instructions in response to an alarm or event in a plant or process in a facility comprising a plurality of devices and one or more control systems for process monitoring and control.
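Claims 15-17 rest on a geometric fact that can be illustrated outside the legal text: the apparent pixel size of a feature of known physical size encodes the viewer-to-object distance, and the ratio of apparent sizes between two views tells the system how to rescale an overlaid annotation. The following is a minimal sketch under an assumed pinhole-camera model; all names, the focal length, and the feature dimensions are illustrative choices, not values taken from the patent.

```python
# Illustrative sketch (not the patented implementation) of the
# size-relationship tracking of claims 15-17: apparent feature size
# -> viewer distance, and annotation rescaling between two views.
from dataclasses import dataclass

FOCAL_LENGTH_PX = 800.0  # assumed pinhole-camera focal length, in pixels


@dataclass
class Feature:
    """A tracked natural feature with a known physical width (metres)."""
    real_width_m: float
    pixel_width: float


def viewer_distance(feature: Feature) -> float:
    """Pinhole model: distance = f * real_width / apparent pixel width."""
    return FOCAL_LENGTH_PX * feature.real_width_m / feature.pixel_width


def rescale_annotation(first: Feature, second: Feature,
                       size_px: float) -> float:
    """Scale an annotation drawn in the first view so it stays
    proportionate to the equipment in the second view (claim 17)."""
    return size_px * (second.pixel_width / first.pixel_width)


# First view: a 0.5 m cabinet edge spans 100 px -> viewer is 4 m away.
v1 = Feature(real_width_m=0.5, pixel_width=100.0)
# Second view: the user has moved closer; the same edge spans 200 px.
v2 = Feature(real_width_m=0.5, pixel_width=200.0)

print(viewer_distance(v1))              # 4.0 (metres)
print(viewer_distance(v2))              # 2.0 (metres)
print(rescale_annotation(v1, v2, 40.0))  # a 40 px label grows to 80.0 px
```

Comparing such distance estimates across successive views is also the essence of the movement-path calculation in claims 14 and 18: a shrinking distance series implies the user is approaching the equipment.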
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2005/003709 WO2007066166A1 (en) | 2005-12-08 | 2005-12-08 | Method and system for processing and displaying maintenance or control instructions |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007066166A1 true WO2007066166A1 (en) | 2007-06-14 |
Family
ID=36617247
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2005/003709 WO2007066166A1 (en) | 2005-12-08 | 2005-12-08 | Method and system for processing and displaying maintenance or control instructions |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2007066166A1 (en) |
Cited By (82)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009036782A1 (en) * | 2007-09-18 | 2009-03-26 | Vrmedia S.R.L. | Information processing apparatus and method for remote technical assistance |
DE102008012122A1 (en) * | 2008-03-01 | 2009-09-03 | Rittal Gmbh & Co. Kg | Testing device for switch cabinet or rack in information technology, has assignment device automatically assigning accessed data to switch cabinet or rack or section of cabinet or rack, where data are represented to part by display unit |
EP2273429A1 (en) * | 2009-07-06 | 2011-01-12 | Siemens Aktiengesellschaft | Globally usable multimedia communication and support system for assembling, inspecting, maintaining and repairing technical systems and method |
WO2011120624A1 (en) * | 2010-03-30 | 2011-10-06 | Khs Gmbh | Mobile maintenance unit |
WO2012137108A1 (en) * | 2011-04-07 | 2012-10-11 | International Business Machines Corporation | Managing computing systems utilizing augmented reality |
EP2626756A1 (en) * | 2012-02-10 | 2013-08-14 | Weber Maschinenbau GmbH Breidenbach | Device with augmented reality |
EP2642331A1 (en) * | 2012-03-21 | 2013-09-25 | Converteam Technology Ltd | Display and Control Systems |
WO2013156342A1 (en) * | 2012-04-20 | 2013-10-24 | Siemens Aktiengesellschaft | Determining the location of a component in an industrial system using a mobile operating device |
WO2013170871A1 (en) * | 2012-05-14 | 2013-11-21 | Abb Research Ltd | Method and industrial control system for industrial process equipment maintenance |
WO2014000832A1 (en) * | 2012-06-25 | 2014-01-03 | Robert Bosch Gmbh | Control device for a field device, and system and method for starting up a field device |
WO2014048759A1 (en) * | 2012-09-27 | 2014-04-03 | Krones Ag | Method for supporting operating and changeover processes |
EP2728846A1 (en) * | 2012-11-06 | 2014-05-07 | Konica Minolta, Inc. | Guidance information display device |
US20140152530A1 (en) * | 2012-12-03 | 2014-06-05 | Honeywell International Inc. | Multimedia near to eye display system |
WO2014127836A1 (en) * | 2013-02-25 | 2014-08-28 | Abb Technology Ltd | Method and device for monitoring and controlling an industrial process |
WO2014184288A1 (en) * | 2013-05-15 | 2014-11-20 | Thales | Maintenance assistance device and maintenance method using such a device |
WO2015001233A1 (en) * | 2013-07-03 | 2015-01-08 | Snecma | Augmented reality method and system for monitoring |
ITBO20130466A1 * | 2013-08-29 | 2015-03-01 | Umpi Elettronica Societa A Responsabilita Lim | Method for the inspection and/or maintenance of a part of an industrial plant by means of augmented reality, and corresponding system for guiding the inspection and/or maintenance of the industrial plant |
EP2626758A3 (en) * | 2012-02-10 | 2015-03-04 | Fisher-Rosemount Systems, Inc. | Methods for collaboratively assisting a control room operator |
EP2847753A1 (en) * | 2012-05-11 | 2015-03-18 | Bosch Automotive Service Solutions LLC | Augmented reality virtual automotive x-ray having service information |
CN104508712A (en) * | 2012-05-15 | 2015-04-08 | 西门子公司 | Verification of component in industrial plant by means of mobile operating device |
EP2652940A4 (en) * | 2010-12-16 | 2015-05-20 | Microsoft Technology Licensing Llc | Comprehension and intent-based content for augmented reality displays |
WO2015101393A1 (en) * | 2013-12-30 | 2015-07-09 | Telecom Italia S.P.A. | Augmented reality for supporting intervention of a network apparatus by a human operator |
WO2015160515A1 (en) | 2014-04-16 | 2015-10-22 | Exxonmobil Upstream Research Company | Methods and systems for providing procedures in real-time |
CN105103068A (en) * | 2013-04-18 | 2015-11-25 | 欧姆龙株式会社 | Work management system and work management method |
EP2996015A1 (en) * | 2014-09-09 | 2016-03-16 | Schneider Electric IT Corporation | Method to use augmented reality to function as hmi display |
US9323325B2 (en) | 2011-08-30 | 2016-04-26 | Microsoft Technology Licensing, Llc | Enhancing an object of interest in a see-through, mixed reality display device |
EP3086193A1 (en) * | 2015-04-24 | 2016-10-26 | JPW Industries Inc. | Wearable display for use with tool |
DE102015006632A1 (en) * | 2015-05-21 | 2016-11-24 | Audi Ag | Method for operating a diagnostic system and diagnostic system for a motor vehicle |
WO2017100554A1 (en) * | 2015-12-11 | 2017-06-15 | ecoATM, Inc. | Systems and methods for recycling consumer electronic devices |
US9740935B2 (en) | 2013-11-26 | 2017-08-22 | Honeywell International Inc. | Maintenance assistant system |
EP3214586A1 (en) * | 2016-03-04 | 2017-09-06 | Thales Deutschland GmbH | Method for maintenance support and maintenance support system |
ITUA20162756A1 * | 2016-04-20 | 2017-10-20 | Newbiquity Sagl | Method and system for real-time remote support with use of computer vision and augmented reality |
WO2017182523A1 (en) * | 2016-04-20 | 2017-10-26 | Newbiquity Sagl | A method and a system for real-time remote support with use of computer vision and augmented reality |
WO2017186450A1 (en) * | 2016-04-26 | 2017-11-02 | Krones Ag | Operating system for a machine of the food industry |
CN107422686A (en) * | 2016-04-06 | 2017-12-01 | 劳斯莱斯电力工程有限公司 | Equipment for allowing the remote control to one or more devices |
US9885672B2 (en) | 2016-06-08 | 2018-02-06 | ecoATM, Inc. | Methods and systems for detecting screen covers on electronic devices |
US9911102B2 (en) | 2014-10-02 | 2018-03-06 | ecoATM, Inc. | Application for device evaluation and other processes associated with device recycling |
WO2018054976A1 (en) * | 2016-09-23 | 2018-03-29 | Philips Lighting Holding B.V. | A building automation system with servicing beacon |
US9965564B2 (en) | 2011-07-26 | 2018-05-08 | Schneider Electric It Corporation | Apparatus and method of displaying hardware status using augmented reality |
US9965841B2 (en) | 2016-02-29 | 2018-05-08 | Schneider Electric USA, Inc. | Monitoring system based on image analysis of photos |
DE102017123940A1 (en) | 2016-10-14 | 2018-05-09 | Blach Verwaltungs GmbH + Co. KG | Augmented reality at extruder plant |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
EP3264209A4 (en) * | 2015-02-24 | 2018-07-25 | Hallys Corporation | Mechanical device management system, mechanical device management device, server for managing mechanical device, mechanical device, and mechanical device management method |
US10127647B2 (en) | 2016-04-15 | 2018-11-13 | Ecoatm, Llc | Methods and systems for detecting cracks in electronic devices |
US10182153B2 (en) | 2016-12-01 | 2019-01-15 | TechSee Augmented Vision Ltd. | Remote distance assistance system and method |
US10223832B2 (en) | 2011-08-17 | 2019-03-05 | Microsoft Technology Licensing, Llc | Providing location occupancy analysis via a mixed reality device |
US10269110B2 (en) | 2016-06-28 | 2019-04-23 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
DE102017219067A1 (en) * | 2017-10-25 | 2019-04-25 | Bayerische Motoren Werke Aktiengesellschaft | DEVICE AND METHOD FOR THE VISUAL SUPPORT OF A USER IN A WORKING ENVIRONMENT |
DE102017130137A1 (en) * | 2017-12-15 | 2019-06-19 | Endress+Hauser SE+Co. KG | Method for simplified commissioning of a field device |
DE102017130138A1 (en) * | 2017-12-15 | 2019-06-19 | Endress+Hauser SE+Co. KG | Method for simplified commissioning of a field device |
EP3502836A1 (en) * | 2017-12-21 | 2019-06-26 | Atos Information Technology GmbH | Method for operating an augmented interactive reality system |
WO2019123187A1 (en) * | 2017-12-20 | 2019-06-27 | Nws Srl | Virtual training method |
US10379607B2 (en) | 2012-07-30 | 2019-08-13 | Agilent Technologies, Inc. | Experimental chamber with computer-controlled display wall |
US10397404B1 (en) | 2016-12-01 | 2019-08-27 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
US10401411B2 (en) | 2014-09-29 | 2019-09-03 | Ecoatm, Llc | Maintaining sets of cable components used for wired analysis, charging, or other interaction with portable electronic devices |
US10417615B2 (en) | 2014-10-31 | 2019-09-17 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
DE102018204152A1 (en) * | 2018-03-19 | 2019-09-19 | Homag Gmbh | System for virtual support of an operator for woodworking machines |
US10445708B2 (en) | 2014-10-03 | 2019-10-15 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US10475002B2 (en) | 2014-10-02 | 2019-11-12 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US10560578B2 (en) | 2016-12-01 | 2020-02-11 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
US10567583B2 (en) | 2016-12-01 | 2020-02-18 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
US10567584B2 (en) | 2016-12-01 | 2020-02-18 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
US10572946B2 (en) | 2014-10-31 | 2020-02-25 | Ecoatm, Llc | Methods and systems for facilitating processes associated with insurance services and/or other services for electronic devices |
WO2020049231A1 (en) | 2018-09-06 | 2020-03-12 | Sidel Participations | Method for computer assistance in the management of a production line |
EP3660609A1 (en) * | 2018-11-09 | 2020-06-03 | Liebherr-Verzahntechnik GmbH | Control panel |
WO2020146972A1 (en) * | 2019-01-14 | 2020-07-23 | Covestro Deutschland Ag | Method and system for controlling of an injection molding process |
US10789775B2 (en) | 2016-07-15 | 2020-09-29 | Beckhoff Automation Gmbh | Method for controlling an object |
US10860990B2 (en) | 2014-11-06 | 2020-12-08 | Ecoatm, Llc | Methods and systems for evaluating and recycling electronic devices |
US11010841B2 (en) | 2008-10-02 | 2021-05-18 | Ecoatm, Llc | Kiosk for recycling electronic devices |
US11080672B2 (en) | 2014-12-12 | 2021-08-03 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US11080662B2 (en) | 2008-10-02 | 2021-08-03 | Ecoatm, Llc | Secondary market and vending system for devices |
US11107046B2 (en) | 2008-10-02 | 2021-08-31 | Ecoatm, Llc | Secondary market and vending system for devices |
US11112135B2 (en) | 2018-11-09 | 2021-09-07 | Johnson Controls Technology Company | Maintenance procedure updating in HVAC system service log |
US11127210B2 (en) | 2011-08-24 | 2021-09-21 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US11247869B2 (en) | 2017-11-10 | 2022-02-15 | Otis Elevator Company | Systems and methods for providing information regarding elevator systems |
US11462868B2 (en) | 2019-02-12 | 2022-10-04 | Ecoatm, Llc | Connector carrier for electronic device kiosk |
US11482067B2 (en) | 2019-02-12 | 2022-10-25 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
US11526932B2 (en) | 2008-10-02 | 2022-12-13 | Ecoatm, Llc | Kiosks for evaluating and purchasing used electronic devices and related technology |
US11798250B2 (en) | 2019-02-18 | 2023-10-24 | Ecoatm, Llc | Neural network based physical condition evaluation of electronic devices, and associated systems and methods |
US11922467B2 (en) | 2020-08-17 | 2024-03-05 | ecoATM, Inc. | Evaluating an electronic device using optical character recognition |
US11989710B2 (en) | 2018-12-19 | 2024-05-21 | Ecoatm, Llc | Systems and methods for vending and/or purchasing mobile phones and other electronic devices |
US12033454B2 (en) | 2020-08-17 | 2024-07-09 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001002953A1 (en) * | 1999-07-06 | 2001-01-11 | Abb Ab | Method of integrating an application in a computerized system |
US20020191002A1 (en) * | 1999-11-09 | 2002-12-19 | Siemens Ag | System and method for object-oriented marking and associating information with selected technological components |
US20040046711A1 (en) * | 2000-12-18 | 2004-03-11 | Siemens Ag | User-controlled linkage of information within an augmented reality system |
US20050021281A1 * | 2001-12-05 | 2005-01-27 | Wolfgang Friedrich | System and method for establishing a documentation of working processes for display in an augmented reality system, in particular in a production, assembly, service or maintenance environment |
WO2005066744A1 (en) * | 2003-12-31 | 2005-07-21 | Abb Research Ltd | A virtual control panel |
- 2005-12-08 WO PCT/IB2005/003709 patent/WO2007066166A1/en active Application Filing
Non-Patent Citations (1)
Title |
---|
ULRICH NEUMANN ET AL: "Natural Feature Tracking for Augmented Reality", IEEE TRANSACTIONS ON MULTIMEDIA, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 1, no. 1, March 1999 (1999-03-01), XP011036279, ISSN: 1520-9210 * |
Cited By (161)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2009112063A2 (en) | 2007-09-18 | 2009-09-17 | Vrmedia S.R.L. | Information processing apparatus and method for remote technical assistance |
WO2009112063A3 (en) * | 2007-09-18 | 2009-11-05 | Vrmedia S.R.L. | Information processing apparatus and method for remote technical assistance |
WO2009036782A1 (en) * | 2007-09-18 | 2009-03-26 | Vrmedia S.R.L. | Information processing apparatus and method for remote technical assistance |
DE102008012122B4 (en) * | 2008-03-01 | 2014-09-11 | Rittal Gmbh & Co. Kg | Testing device for control cabinets or racks |
DE102008012122A1 (en) * | 2008-03-01 | 2009-09-03 | Rittal Gmbh & Co. Kg | Testing device for switch cabinet or rack in information technology, has assignment device automatically assigning accessed data to switch cabinet or rack or section of cabinet or rack, where data are represented to part by display unit |
US11080662B2 (en) | 2008-10-02 | 2021-08-03 | Ecoatm, Llc | Secondary market and vending system for devices |
US11790328B2 (en) | 2008-10-02 | 2023-10-17 | Ecoatm, Llc | Secondary market and vending system for devices |
US11010841B2 (en) | 2008-10-02 | 2021-05-18 | Ecoatm, Llc | Kiosk for recycling electronic devices |
US12198108B2 (en) | 2008-10-02 | 2025-01-14 | Ecoatm, Llc | Secondary market and vending system for devices |
US12182773B2 (en) | 2008-10-02 | 2024-12-31 | Ecoatm, Llc | Secondary market and vending system for devices |
US11935138B2 (en) | 2008-10-02 | 2024-03-19 | ecoATM, Inc. | Kiosk for recycling electronic devices |
US11907915B2 (en) | 2008-10-02 | 2024-02-20 | Ecoatm, Llc | Secondary market and vending system for devices |
US11107046B2 (en) | 2008-10-02 | 2021-08-31 | Ecoatm, Llc | Secondary market and vending system for devices |
US11526932B2 (en) | 2008-10-02 | 2022-12-13 | Ecoatm, Llc | Kiosks for evaluating and purchasing used electronic devices and related technology |
WO2011003727A1 (en) * | 2009-07-06 | 2011-01-13 | Siemens Aktiengesellschaft | Multimedia communication and support system that can be used worldwide for assembly, inspection, maintenance, and repair assignments in technical facilities, and method |
CN102473250A (en) * | 2009-07-06 | 2012-05-23 | 西门子公司 | Multimedia communication and support system and method for assembly, inspection, maintenance and repair work in technical installations, which can be used worldwide |
EP2273429A1 (en) * | 2009-07-06 | 2011-01-12 | Siemens Aktiengesellschaft | Globally usable multimedia communication and support system for assembling, inspecting, maintaining and repairing technical systems and method |
WO2011120624A1 (en) * | 2010-03-30 | 2011-10-06 | Khs Gmbh | Mobile maintenance unit |
US9213405B2 (en) | 2010-12-16 | 2015-12-15 | Microsoft Technology Licensing, Llc | Comprehension and intent-based content for augmented reality displays |
EP2652940A4 (en) * | 2010-12-16 | 2015-05-20 | Microsoft Technology Licensing Llc | Comprehension and intent-based content for augmented reality displays |
US9219665B2 (en) | 2011-04-07 | 2015-12-22 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
US8990385B2 (en) | 2011-04-07 | 2015-03-24 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
US9391860B2 (en) | 2011-04-07 | 2016-07-12 | Globalfoundries, Inc. | Systems and methods for managing computing systems utilizing augmented reality |
US8918494B2 (en) | 2011-04-07 | 2014-12-23 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
WO2012137108A1 (en) * | 2011-04-07 | 2012-10-11 | International Business Machines Corporation | Managing computing systems utilizing augmented reality |
US9219666B2 (en) | 2011-04-07 | 2015-12-22 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
GB2505099A (en) * | 2011-04-07 | 2014-02-19 | Ibm | Managing computing systems utilizing augmented reality |
US8738754B2 (en) | 2011-04-07 | 2014-05-27 | International Business Machines Corporation | Systems and methods for managing computing systems utilizing augmented reality |
US9712413B2 (en) | 2011-04-07 | 2017-07-18 | Globalfoundries Inc. | Systems and methods for managing computing systems utilizing augmented reality |
US9965564B2 (en) | 2011-07-26 | 2018-05-08 | Schneider Electric It Corporation | Apparatus and method of displaying hardware status using augmented reality |
US10223832B2 (en) | 2011-08-17 | 2019-03-05 | Microsoft Technology Licensing, Llc | Providing location occupancy analysis via a mixed reality device |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
US11127210B2 (en) | 2011-08-24 | 2021-09-21 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US9323325B2 (en) | 2011-08-30 | 2016-04-26 | Microsoft Technology Licensing, Llc | Enhancing an object of interest in a see-through, mixed reality display device |
EP2626756A1 (en) * | 2012-02-10 | 2013-08-14 | Weber Maschinenbau GmbH Breidenbach | Device with augmented reality |
US9785133B2 (en) | 2012-02-10 | 2017-10-10 | Fisher-Rosemount Systems, Inc. | Methods for collaboratively assisting a control room operator |
EP2626758A3 (en) * | 2012-02-10 | 2015-03-04 | Fisher-Rosemount Systems, Inc. | Methods for collaboratively assisting a control room operator |
EP2642331A1 (en) * | 2012-03-21 | 2013-09-25 | Converteam Technology Ltd | Display and Control Systems |
WO2013156342A1 (en) * | 2012-04-20 | 2013-10-24 | Siemens Aktiengesellschaft | Determining the location of a component in an industrial system using a mobile operating device |
EP2847753A1 (en) * | 2012-05-11 | 2015-03-18 | Bosch Automotive Service Solutions LLC | Augmented reality virtual automotive x-ray having service information |
EP2847753A4 (en) * | 2012-05-11 | 2015-12-16 | Bosch Automotive Service Solutions Llc | Augmented reality virtual automotive x-ray having service information |
WO2013170871A1 (en) * | 2012-05-14 | 2013-11-21 | Abb Research Ltd | Method and industrial control system for industrial process equipment maintenance |
CN104508712A (en) * | 2012-05-15 | 2015-04-08 | 西门子公司 | Verification of component in industrial plant by means of mobile operating device |
CN104508712B (en) | 2012-05-15 | 2017-06-20 | 首要金属科技德国有限责任公司 | Verification of a component in an industrial plant by means of a mobile operating device |
WO2014000832A1 (en) * | 2012-06-25 | 2014-01-03 | Robert Bosch Gmbh | Control device for a field device, and system and method for starting up a field device |
US10642354B2 (en) | 2012-07-30 | 2020-05-05 | Agilent Technologies, Inc. | Experimental chamber with computer-controlled display wall |
US10379607B2 (en) | 2012-07-30 | 2019-08-13 | Agilent Technologies, Inc. | Experimental chamber with computer-controlled display wall |
CN104704432A (en) * | 2012-09-27 | 2015-06-10 | 克朗斯股份公司 | Method for supporting operating and changeover processes |
WO2014048759A1 (en) * | 2012-09-27 | 2014-04-03 | Krones Ag | Method for supporting operating and changeover processes |
US9760168B2 (en) | 2012-11-06 | 2017-09-12 | Konica Minolta, Inc. | Guidance information display device |
EP2728846A1 (en) * | 2012-11-06 | 2014-05-07 | Konica Minolta, Inc. | Guidance information display device |
US20140152530A1 (en) * | 2012-12-03 | 2014-06-05 | Honeywell International Inc. | Multimedia near to eye display system |
WO2014127836A1 (en) * | 2013-02-25 | 2014-08-28 | Abb Technology Ltd | Method and device for monitoring and controlling an industrial process |
US9720402B2 (en) | 2013-02-25 | 2017-08-01 | Abb Schweiz Ag | Method and device for monitoring and controlling an industrial process |
CN104995573A (en) * | 2013-02-25 | 2015-10-21 | Abb技术有限公司 | Method and device for monitoring and controlling an industrial process |
CN104995573B (en) * | 2013-02-25 | 2018-04-13 | Abb 技术有限公司 | Method and apparatus for monitoring and controlling industrial process |
EP2988184A4 (en) * | 2013-04-18 | 2016-07-13 | Omron Tateisi Electronics Co | Work management system and work management method |
US9953374B2 (en) | 2013-04-18 | 2018-04-24 | Omron Corporation | Work management system and work management method |
EP3001268A3 (en) * | 2013-04-18 | 2016-07-13 | Omron Corporation | Work management system and work management method |
US9959578B2 (en) | 2013-04-18 | 2018-05-01 | Omron Corporation | Work management system and work management method |
US9953376B2 (en) | 2013-04-18 | 2018-04-24 | Omron Corporation | Work management system and work management method |
EP3001270A3 (en) * | 2013-04-18 | 2016-07-06 | Omron Corporation | Work management system and work management method |
US10019764B2 (en) | 2013-04-18 | 2018-07-10 | Omron Corporation | Work management system and work management method |
EP3001269A3 (en) * | 2013-04-18 | 2016-07-06 | Omron Corporation | Work management system and work management method |
EP3001267A3 (en) * | 2013-04-18 | 2016-07-06 | Omron Corporation | Work management system and work management method |
US9953375B2 (en) | 2013-04-18 | 2018-04-24 | Omron Corporation | Work management system and work management method |
CN105103068A (en) * | 2013-04-18 | 2015-11-25 | 欧姆龙株式会社 | Work management system and work management method |
FR3005819A1 (en) * | 2013-05-15 | 2014-11-21 | Thales Sa | MAINTENANCE SUPPORT DEVICE AND MAINTENANCE METHOD USING SUCH A DEVICE |
WO2014184288A1 (en) * | 2013-05-15 | 2014-11-20 | Thales | Maintenance assistance device and maintenance method using such a device |
WO2015001233A1 (en) * | 2013-07-03 | 2015-01-08 | Snecma | Augmented reality method and system for monitoring |
FR3008210A1 * | 2013-07-03 | 2015-01-09 | Snecma | Augmented reality method and system for monitoring |
WO2015028978A1 (en) * | 2013-08-29 | 2015-03-05 | Umpi Elettronica - Societa' A Responsabilita' Limitata | Method and system for the inspection and/or the maintenance of an electrical panel by means of augmented reality |
ITBO20130466A1 * | 2013-08-29 | 2015-03-01 | Umpi Elettronica Societa A Responsabilita Lim | Method for the inspection and/or maintenance of a part of an industrial plant by means of augmented reality, and corresponding system for guiding the inspection and/or maintenance of the industrial plant |
US9740935B2 (en) | 2013-11-26 | 2017-08-22 | Honeywell International Inc. | Maintenance assistant system |
EP2876484B1 (en) * | 2013-11-26 | 2019-03-06 | Honeywell International Inc. | Maintenance assistant system |
WO2015101393A1 (en) * | 2013-12-30 | 2015-07-09 | Telecom Italia S.P.A. | Augmented reality for supporting intervention of a network apparatus by a human operator |
US10171628B2 (en) | 2013-12-30 | 2019-01-01 | Telecom Italia S.P.A. | Augmented reality for supporting intervention of a network apparatus by a human operator |
WO2015160515A1 (en) | 2014-04-16 | 2015-10-22 | Exxonmobil Upstream Research Company | Methods and systems for providing procedures in real-time |
EP2996015A1 (en) * | 2014-09-09 | 2016-03-16 | Schneider Electric IT Corporation | Method to use augmented reality to function as hmi display |
US10401411B2 (en) | 2014-09-29 | 2019-09-03 | Ecoatm, Llc | Maintaining sets of cable components used for wired analysis, charging, or other interaction with portable electronic devices |
US9911102B2 (en) | 2014-10-02 | 2018-03-06 | ecoATM, Inc. | Application for device evaluation and other processes associated with device recycling |
US11734654B2 (en) | 2014-10-02 | 2023-08-22 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US11126973B2 (en) | 2014-10-02 | 2021-09-21 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US10496963B2 (en) | 2014-10-02 | 2019-12-03 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US10475002B2 (en) | 2014-10-02 | 2019-11-12 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US10438174B2 (en) | 2014-10-02 | 2019-10-08 | Ecoatm, Llc | Application for device evaluation and other processes associated with device recycling |
US12217221B2 (en) | 2014-10-02 | 2025-02-04 | Ecoatm, Llc | Wireless-enabled kiosk for recycling consumer devices |
US11790327B2 (en) | 2014-10-02 | 2023-10-17 | Ecoatm, Llc | Application for device evaluation and other processes associated with device recycling |
US11232412B2 (en) | 2014-10-03 | 2022-01-25 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US11989701B2 (en) | 2014-10-03 | 2024-05-21 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US10445708B2 (en) | 2014-10-03 | 2019-10-15 | Ecoatm, Llc | System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods |
US12205081B2 (en) | 2014-10-31 | 2025-01-21 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US10572946B2 (en) | 2014-10-31 | 2020-02-25 | Ecoatm, Llc | Methods and systems for facilitating processes associated with insurance services and/or other services for electronic devices |
US11436570B2 (en) | 2014-10-31 | 2022-09-06 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US10417615B2 (en) | 2014-10-31 | 2019-09-17 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US10860990B2 (en) | 2014-11-06 | 2020-12-08 | Ecoatm, Llc | Methods and systems for evaluating and recycling electronic devices |
US11080672B2 (en) | 2014-12-12 | 2021-08-03 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US12008520B2 (en) | 2014-12-12 | 2024-06-11 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
US11315093B2 (en) | 2014-12-12 | 2022-04-26 | Ecoatm, Llc | Systems and methods for recycling consumer electronic devices |
EP3264209A4 (en) * | 2015-02-24 | 2018-07-25 | Hallys Corporation | Mechanical device management system, mechanical device management device, server for managing mechanical device, mechanical device, and mechanical device management method |
CN106066633A (en) * | 2015-04-24 | 2016-11-02 | Jpw工业有限公司 | Wearable display device for use with a machine tool |
US9972133B2 (en) | 2015-04-24 | 2018-05-15 | Jpw Industries Inc. | Wearable display for use with tool |
US10685494B2 (en) | 2015-04-24 | 2020-06-16 | Jpw Industries Inc. | Wearable display for use with tool |
EP3086193A1 (en) * | 2015-04-24 | 2016-10-26 | JPW Industries Inc. | Wearable display for use with tool |
DE102015006632A1 (en) * | 2015-05-21 | 2016-11-24 | Audi Ag | Method for operating a diagnostic system and diagnostic system for a motor vehicle |
EP4099667A1 (en) * | 2015-12-11 | 2022-12-07 | ecoATM, LLC | Systems and methods for recycling consumer electronic devices |
WO2017100554A1 (en) * | 2015-12-11 | 2017-06-15 | ecoATM, Inc. | Systems and methods for recycling consumer electronic devices |
US9965841B2 (en) | 2016-02-29 | 2018-05-08 | Schneider Electric USA, Inc. | Monitoring system based on image analysis of photos |
WO2017149120A1 (en) * | 2016-03-04 | 2017-09-08 | Thales Deutschland Gmbh | Method for maintenance support and maintenance support system |
EP3214586A1 (en) * | 2016-03-04 | 2017-09-06 | Thales Deutschland GmbH | Method for maintenance support and maintenance support system |
CN107422686A (en) * | 2016-04-06 | 2017-12-01 | 劳斯莱斯电力工程有限公司 | Apparatus for enabling remote control of one or more devices |
CN107422686B (en) * | 2016-04-06 | 2022-05-03 | 劳斯莱斯电力工程有限公司 | Apparatus for enabling remote control of one or more devices |
US10127647B2 (en) | 2016-04-15 | 2018-11-13 | Ecoatm, Llc | Methods and systems for detecting cracks in electronic devices |
ITUA20162756A1 (en) * | 2016-04-20 | 2017-10-20 | Newbiquity Sagl | Method and system for real-time remote assistance with use of computer vision and augmented reality |
WO2017182523A1 (en) * | 2016-04-20 | 2017-10-26 | Newbiquity Sagl | A method and a system for real-time remote support with use of computer vision and augmented reality |
US11199830B2 (en) | 2016-04-26 | 2021-12-14 | Krones Ag | Operating system for a machine of the food industry |
WO2017186450A1 (en) * | 2016-04-26 | 2017-11-02 | Krones Ag | Operating system for a machine of the food industry |
CN109074048A (en) * | 2016-04-26 | 2018-12-21 | 克朗斯股份公司 | Operating system for a machine of the food industry |
US9885672B2 (en) | 2016-06-08 | 2018-02-06 | ecoATM, Inc. | Methods and systems for detecting screen covers on electronic devices |
US10269110B2 (en) | 2016-06-28 | 2019-04-23 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
US10909673B2 (en) | 2016-06-28 | 2021-02-02 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
US11803954B2 (en) | 2016-06-28 | 2023-10-31 | Ecoatm, Llc | Methods and systems for detecting cracks in illuminated electronic device screens |
US10789775B2 (en) | 2016-07-15 | 2020-09-29 | Beckhoff Automation Gmbh | Method for controlling an object |
US11003152B2 (en) | 2016-09-23 | 2021-05-11 | Signify Holding B.V. | Building automation system with servicing beacon |
WO2018054976A1 (en) * | 2016-09-23 | 2018-03-29 | Philips Lighting Holding B.V. | A building automation system with servicing beacon |
CN109791397A (en) * | 2016-09-23 | 2019-05-21 | 昕诺飞控股有限公司 | Building automation system with maintenance beacon |
CN109791397B (en) * | 2016-09-23 | 2022-07-01 | 昕诺飞控股有限公司 | Building automation system with maintenance beacon |
DE102017123940A1 (en) | 2016-10-14 | 2018-05-09 | Blach Verwaltungs GmbH + Co. KG | Augmented reality at extruder plant |
US10182153B2 (en) | 2016-12-01 | 2019-01-15 | TechSee Augmented Vision Ltd. | Remote distance assistance system and method |
US10560578B2 (en) | 2016-12-01 | 2020-02-11 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
US10805466B2 (en) | 2016-12-01 | 2020-10-13 | TechSee Augmented Vision Ltd. | Remote distance assistance system and method |
US10397404B1 (en) | 2016-12-01 | 2019-08-27 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
US11323568B2 (en) | 2016-12-01 | 2022-05-03 | TechSee Augmented Vision Ltd. | Remote distance assistance system and method |
US10567583B2 (en) | 2016-12-01 | 2020-02-18 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
US10313523B2 (en) | 2016-12-01 | 2019-06-04 | TechSee Augmented Vision Ltd. | Remote distance assistance system and method |
US10567584B2 (en) | 2016-12-01 | 2020-02-18 | TechSee Augmented Vision Ltd. | Methods and systems for providing interactive support sessions |
DE102017219067A1 (en) * | 2017-10-25 | 2019-04-25 | Bayerische Motoren Werke Aktiengesellschaft | DEVICE AND METHOD FOR THE VISUAL SUPPORT OF A USER IN A WORKING ENVIRONMENT |
US11247869B2 (en) | 2017-11-10 | 2022-02-15 | Otis Elevator Company | Systems and methods for providing information regarding elevator systems |
DE102017130138A1 (en) * | 2017-12-15 | 2019-06-19 | Endress+Hauser SE+Co. KG | Method for simplified commissioning of a field device |
EP3724612B1 (en) * | 2017-12-15 | 2023-01-18 | Endress+Hauser SE+Co. KG | Method for starting up a field device in a simplified manner |
DE102017130137A1 (en) * | 2017-12-15 | 2019-06-19 | Endress+Hauser SE+Co. KG | Method for simplified commissioning of a field device |
US11454942B2 (en) | 2017-12-15 | 2022-09-27 | Endress+Hauser SE+Co. KG | Method for starting up a field device in a simplified manner |
US11454533B2 (en) | 2017-12-15 | 2022-09-27 | Endress+Hauser SE+Co. KG | Method for starting up a field device in a simplified manner |
WO2019123187A1 (en) * | 2017-12-20 | 2019-06-27 | Nws Srl | Virtual training method |
CN111512250A (en) * | 2017-12-20 | 2020-08-07 | Nws(股份)责任有限公司 | Virtual training method |
EP3502836A1 (en) * | 2017-12-21 | 2019-06-26 | Atos Information Technology GmbH | Method for operating an augmented interactive reality system |
DE102018204152A1 (en) * | 2018-03-19 | 2019-09-19 | Homag Gmbh | System for virtual support of an operator for woodworking machines |
WO2020049231A1 (en) | 2018-09-06 | 2020-03-12 | Sidel Participations | Method for computer assistance in the management of a production line |
EP3660609A1 (en) * | 2018-11-09 | 2020-06-03 | Liebherr-Verzahntechnik GmbH | Control panel |
US11112135B2 (en) | 2018-11-09 | 2021-09-07 | Johnson Controls Technology Company | Maintenance procedure updating in HVAC system service log |
US11989710B2 (en) | 2018-12-19 | 2024-05-21 | Ecoatm, Llc | Systems and methods for vending and/or purchasing mobile phones and other electronic devices |
CN113423554A (en) * | 2019-01-14 | 2021-09-21 | 科思创知识产权两合公司 | Method and system for controlling an injection molding process |
EP3924161A4 (en) * | 2019-01-14 | 2022-11-30 | Covestro Intellectual Property GmbH & Co. KG | Method and system for controlling of an injection molding process |
WO2020146972A1 (en) * | 2019-01-14 | 2020-07-23 | Covestro Deutschland Ag | Method and system for controlling of an injection molding process |
US11462868B2 (en) | 2019-02-12 | 2022-10-04 | Ecoatm, Llc | Connector carrier for electronic device kiosk |
US11843206B2 (en) | 2019-02-12 | 2023-12-12 | Ecoatm, Llc | Connector carrier for electronic device kiosk |
US11482067B2 (en) | 2019-02-12 | 2022-10-25 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
US11798250B2 (en) | 2019-02-18 | 2023-10-24 | Ecoatm, Llc | Neural network based physical condition evaluation of electronic devices, and associated systems and methods |
US12223684B2 (en) | 2019-02-18 | 2025-02-11 | Ecoatm, Llc | Neural network based physical condition evaluation of electronic devices, and associated systems and methods |
US11922467B2 (en) | 2020-08-17 | 2024-03-05 | ecoATM, Inc. | Evaluating an electronic device using optical character recognition |
US12033454B2 (en) | 2020-08-17 | 2024-07-09 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2007066166A1 (en) | Method and system for processing and displaying maintenance or control instructions | |
CN113703569B (en) | System and method for virtual reality and augmented reality for industrial automation | |
CN108089696B (en) | Virtual reality and augmented reality for industrial automation | |
US8225226B2 (en) | Virtual control panel | |
US7787992B2 (en) | Method to generate a human machine interface | |
US8191005B2 (en) | Dynamically generating visualizations in industrial automation environment as a function of context and state information | |
US8290894B2 (en) | Web-based visualization mash-ups for industrial automation | |
US7110909B2 (en) | System and method for establishing a documentation of working processes for display in an augmented reality system, in particular in a production, assembly, service or maintenance environment |
Zhang et al. | RFID-assisted assembly guidance system in an augmented reality environment | |
US8026933B2 (en) | Visualization system(s) and method(s) for preserving or augmenting resolution and data associated with zooming or panning in an industrial automation environment |
US20090086021A1 (en) | Dynamically generating real-time visualizations in industrial automation environment as a function of context and state information |
US20090089701A1 (en) | Distance-wise presentation of industrial automation data as a function of relevance to user | |
US20090088883A1 (en) | Surface-based computing in an industrial automation environment | |
JP7377318B2 (en) | work order system | |
US20090089682A1 (en) | Collaborative environment for sharing visualizations of industrial automation data | |
US20100257464A1 (en) | System and method for immersive operations intelligence | |
US20110134204A1 (en) | System and methods for facilitating collaboration of a group | |
CN106340217A (en) | Intelligent manufacturing equipment system based on augmented reality technology and its implementation method |
US20200042793A1 (en) | Creating, managing and accessing spatially located information utilizing augmented reality and web technologies | |
Becher et al. | Situated visual analysis and live monitoring for manufacturing | |
EP1975754B1 (en) | A computer implemented method to display technical data for monitoring an industrial installation | |
Klimant et al. | Augmented reality solutions in mechanical engineering | |
Url et al. | Practical insights on augmented reality support for shop-floor tasks | |
Permin et al. | Smart devices in production system maintenance | |
Francisco et al. | Augmented Reality and Digital Twin for Mineral Industry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
NENP | Non-entry into the national phase | Ref country code: DE ||
122 | Ep: pct application non-entry in european phase | Ref document number: 05813837; Country of ref document: EP; Kind code of ref document: A1 ||